Hi all,
This is my current rig:
-AMD Athlon64 3200+ (2 GHz) OC to 2.4 GHz with Scythe KAMA HSF
-ABIT AN8 Ultra nForce 4 Ultra Socket 939
-2x512 MB Geil DDR400 in Dual channel @ stock 400 MHz (set at 333 in BIOS)
-Gigabyte NX66128-DP2 GeForce 6600 128 MB 450/1000 @ 510/1130 (more about this one later...)
-LG 1932P 19'' LCD 1280x1024 @ 60 Hz
-Samsung SP2504C 250 GB SATA 3Gb/s 7200 RPM 8 MB buffer NCQ
-Samsung SP1213N 120 GB ATA133 7200 RPM 8 MB buffer
-Creative Audigy2 ZS
+ other stuff. PSU intentionally omitted, to be replaced anyway. Running XP SP2 for now, but will probably go dual booting with Vista 32-bit soon (this rig got a 4.5 score in Vista and ran it pretty well, quite fast actually, but then again, it was a clean install, without AV, firewall and all the rest).
Up until now, this rig has been both an HTPC AND a gaming rig, being pretty good at both, given the tight budget and the timeframe of the build (August 2005).
As an HTPC, the ABIT MB, with its uGuru BIOS & Windows software, allows for very quiet operation despite a total of 6 fans, all of them - including the PSU and the video card with its aftermarket fan - connected to the MB fan headers and fully monitored & controlled via uGuru (undervolted during idle & normal operation, ramping up to full 12 V RPM depending on load & temps).
As a gaming rig, the catch is that the apparently vanilla GeForce 6600 from Gigabyte is in fact an underclocked 6600 GT: it runs the same graphics core at 450 MHz as opposed to the 500 MHz on the GT, while the VRAM is the same Samsung GDDR3 2 ns 128 MB @ 1000 MHz (2x500) as on the GT. I am still quite proud of being able to snatch this particular "GT" for 35 bucks less than a "real" GT ;-) As mentioned, this card is now @ 510 MHz core / 1130 MHz VRAM (it should go higher, but not with the current cooling).
I said "up until now" because this rig handled pretty much all games thrown at it in the first year and a half - with max settings at first, then with some on medium, and so on. It even played Far Cry on max with HDR & 8xAF in 1024x768 on my old CRT! I skipped BioShock, STALKER & some newer titles (as in not even trying to run them), with the last title played being Half-Life 2 Ep. 2 (no HDR here though, even though Lost Coast was playable with it turned on).
And then came the Crysis SP demo. Let's just say that it would be "playable" - and even with some settings on Medium or High - only if 800x600 would qualify as such on a 1280x1024 LCD panel. Well, at least I saw, @ a couple of FPS ;-), what it would look like with all settings on High (DX9 mode). Pretty badass!
Damn, this post got this long and I didn't even get to asking you guys for an opinion on my current dilemma. Anyway, here is the thing: I would love to play Crysis (and those other recent titles I missed) at max settings on a new rig, but this is not possible ATM, for two reasons: a tight budget (again) and the fact that no existing hardware is powerful enough to run Crysis in DX10 mode with all settings on Very High. So the new rig option is gonna have to wait (I think Nehalem with whatever graphics cards are available then should do the trick). Thus, I only have one shot left: the 8800 GT.
True, even with high-end quads and high-performance memory in 4 GB configurations, the 8800 GT only pulls 35 FPS @ 1280x1024 (my gaming resolution) in DX9 with High settings in Crysis. But then again, the real bottleneck seems to be the video card, not the CPU or system memory. I know the final version of the game may show some improvement over the demo in this regard, but still, I would like to know your opinion: is my current configuration going to be a major bottleneck for an 8800 GT at 1280x1024? Do you guys think I will be able to reach those 35 FPS, or at least 30, in Crysis DX9 all High with this card in my current system? (A new PSU with a PCIe connector would also be needed.) Would this upgrade be worth 300 bucks now (200-250 for the 8800 GT + the rest on a decent PSU - I'm thinking the likes of the Corsair VX450), or should I wait 1 more year and build a new rig around Christmas 2008? I'm guessing that if it is enough to play Crysis on High, it would also play upcoming titles like NFS ProStreet, BlackSite: Area 51 etc. with medium to high image quality.
Also, the added noise is another factor to consider; I will have to find a card that either allows fan speed control via its own driver/software, or - like I do now - lets me connect its fan to a motherboard fan header.
Well, this is it. I'm sorry it ended up such a long post. Can't wait to hear your opinions on this.
Thanks.
P.S. I do not intend to upgrade other components in this rig either, such as the CPU or memory. Being on S939, it would cost me more than a new system at this point :-(