Quantitative and qualitative comparison of overclocked ATi X1900XT and nVidia 7900GT

imported_ST

Senior member
Oct 10, 2004
733
0
0
Preface: Although there have been vast comparisons between X1900XT and 7900GT offerings, most of them have involved just plain performance numbers without much insight into image quality, which can be equally important. This brief post does not attempt to corral the endless number of overclock and image-option configurations available for both solutions, but rather attempts to classify the performance from a visual standpoint, in terms of framerates and quality, with the more easily achievable clocks and settings. Where there are gaps, as I'm sure there will be, I will do what I can to fill them in, but I too am bandwidth limited (Real life with work and a girlfriend who hates computers cannot readily be overcome with MHz! ). I'd like to thank Elfear for his contributions to this article and for helping me on the ATI front.

Quantitative and qualitative comparison of overclocked ATi X1900XT and nVidia 7900GT

Part I - Oblivion performance and IQ benchmarks

Much has been made of ATI's excellent X1900XT VPU
( http://www.anandtech.com/video/showdoc.aspx?i=2679 ) and for good reason. With its 625MHz stock core clock, GDDR3 memory running at an aggregate 1450MHz data rate, and a staggering 48 pixel shader architecture, not to mention very competitive pricing in the low $425 range, it has gone on to take the performance/value crown in the present-day video arena. Combine this with folks readily achieving X1900XTX clocks or well beyond ( http://www.ocforums.com/showthread.php?t=440151 ), and it is a monster of a card for a fair entry price!

While the affordable $300 7900GT has been a hot-selling commodity, its stock performance has been lackluster compared to ATI's X1xxx line of graphics cards. Spotting its Canadian competitor 175 MHz in core clock, 100 MHz in memory clock, and 24 pixel shaders in "base" form, the little NVIDIA champ doesn't bring much punch to the battle. But just like David vs. Goliath, its new 90nm GPU process fortunately brings a slingshot of an overclock to the table to try to topple its heavily favored arch-rival. The 7900GT volt mod ( http://forums.anandtech.com/messageview...atid=31&threadid=1847758&enterthread=y ) has brought a lot of controversy to the community for its simplicity, as well as for the performance it brings to this once-mediocre platform; rather than ATI's more user-friendly software-adjustable voltage modifications, it requires a hard (semi-permanent) solution via a conductive ink pen in order to achieve the same voltage effects. Although the long-term effects of both overvoltage solutions remain to be seen in terms of reliability, quite a few folks have already tried this 7900GT volt mod with much success (
http://www.xtremesystems.org/forums/showthread.php?t=92874 ). Make no mistake about it, not since the recent trend of AMD Opteron 165 CPUs overclocking to absurd frequencies have we seen such a performance delta between stock and modified.

In this first round of comparisons, we take a look at both platforms from a quantitative (performance) as well as qualitative (image quality) perspective under Oblivion, the hottest game to date. Sporting a software engine that supports SM3.0 HDR effects, as well as immense shadowing and texturing, it puts a heavy emphasis on GPU performance. Two particular benchmark methodologies will be utilized:

1) First, stressing raw GPU horsepower, two sets of gameplay, a ~10 minute outdoor trek and a ~5 minute run through an actual Oblivion Gate dungeon, will both be FRAPed for framerate comparisons

2) Secondarily, an image quality shot will be taken on the different platforms at a set point with different image quality settings in anti-aliasing, anisotropic filtering, etc.

All tests will be performed at 1920x1080 resolution for the time being, as it is the native resolution of my monitor. Note: Because I am doing the frapping in actual gameplay, there will be some variance in the tests. I would imagine a margin of error of ~5% for the scores due to the random nature of executing each run in a real live game environment. This testing philosophy follows Anandtech's own GPU comparison in Oblivion ( http://www.anandtech.com/video/showdoc.aspx?i=2746 ). Many of the tests were double, even triple checked to ensure accuracy. What should matter most to folks is not necessarily the maximum framerate, but rather the average, and to a certain extent the minimum, as they are more indicative of actual gameplay.
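For the curious, the math behind the FRAPS summary lines below is simple: the reported average is just total frames divided by elapsed seconds. A quick Python sketch (using the stock X1900XT Run A numbers reported further down) shows how the averages line up:

```python
# FRAPS reports Frames, Time (ms), Min, Max, Avg per run; the average FPS
# is simply total frames divided by elapsed time in seconds.

def avg_fps(frames, time_ms):
    """Average framerate over a FRAPS capture."""
    return frames / (time_ms / 1000.0)

# Stock X1900XT, Run A (HDR, no AA/AF): 14599 frames over 523496 ms
print(round(avg_fps(14599, 523496), 3))  # -> 27.888, matching the FRAPS log
```

This is also why run length doesn't need to be identical between runs for the averages to be comparable, although the minimums remain sensitive to what happens moment to moment.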

The 7900GT was first run with the "soft" auto-overclock settings of 520MHz core and 720MHz memory via Coolbits for comparison purposes. It was then volt-modded to 1.4V and again auto-overclocked, to 675MHz core and 875MHz memory clocks, with Coolbits and the PowerStrip utility. Note: this was using an aftermarket Arctic Cooling NV Silencer 5 (which most former 7800GT/GTX owners that upgraded already have), which is desirable for maximum overclocks. The stock "base" 7900GT HSF is small, somewhat noisy with its whiny fan, and does not cover the memory chips sufficiently. If you have one of the "factory overclocked" versions, these have a better copper base and a quieter solution that also covers the memory areas.

The ATI X1900XT was overclocked via Catalyst Control Center's Overdrive utility, which automatically settled on a maximum of 655MHz core and 792MHz memory clocks; not too shabby, since that is beyond X1900XTX specifications. Note: further overclocking tools such as ATITool ( http://www.techpowerup.com/atitool/ ) would allow you to overclock even more by tweaking the clocks and voltages independently of the CCC controls. For this particular benchmark, I settled on the Overdrive-derived overclocks, as higher settings exhibited instability issues on my particular card. It is not my intention to portray the X1900XT as having a low overclock ceiling, but I am time and resource constrained putting all the comparisons together. In fact, many people have achieved much greater overclocks on the platform as well! Note: I only utilized the stock HSF on the X1900XT, as it is a rather beefy unit and is actually pretty quiet...when not run at full GPU capacity for an extended amount of time. BUT, when gameplay gets heavy and the card has sufficiently heat soaked, the blower emits a rather annoying moan at its higher settings, akin to a leafblower, and gets rather HOT. I did not have an aftermarket HSF available, but there are very quiet solutions such as Arctic Cooling's Accelero line that may also achieve better overclocks, although again many have achieved 700+MHz on the stock cooler.

System Setup:
Asus A8N32-SLI motherboard
AMD Opteron 165 @ 3.0GHz (Dual Core hotfix enabled)
2GB Corsair XMS 3200 2.5-3-3-6 @ 214MHz
Hitachi 500GB x 3 (RAID 0) SATA2
Windows XP Pro SP2
Asus 7900GT w/ NVIDIA 84.43 Drivers
ATI Radeon X1900XT w/ Catalyst 6.4 + 6.3 w/ "Chuck" patch
Oblivion Settings: All high settings (100% or quality enabled)


Test 1: Performance testing under usual gameplay

Run A (Outdoor Areas): Testing methodology will consist of having the main character ride around the outer road of the Imperial City all the way to the southern bridge. This horse ride will go through heavy forest and grass areas, as well as past at least one Oblivion gate and several enemy characters. When the character arrives at the end bridge, he will fight 1-3 foes, including the highwayman on the bridge. Once completed, the FRAP run will be stopped.

HDR w/ No AA, No AF - Performance setting:

ATI Stock 621MHz Core - 720MHz Mem
Frames, Time (ms), Min, Max, Avg
14599, 523496, 18, 66, 27.888

ATI OC 655MHz Core - 792MHz Mem
Frames, Time (ms), Min, Max, Avg
16561, 551838, 11, 63, 30.011

ATI OC 692MHz Core - 842MHz Mem (Soft Volt Moded)
Frames, Time (ms), Min, Max, Avg
16675, 522249, 12, 57, 31.929

NV 7900GT Stock OC 520MHz Core - 720MHz Mem
Frames, Time (ms), Min, Max, Avg
13144, 554991, 1, 46, 23.683

NV 7900GT VM OC 675MHz Core - 875MHz Mem
Frames, Time (ms), Min, Max, Avg
17245, 571843, 9, 60, 30.157

Assessment: With just HDR turned on in their respective performance quality settings, the overclocked X1900XT and 7900GT come to almost a dead heat in terms of framerates. You'll note how the stock OC 7900GT lags behind, but it still has a somewhat respectable and playable average framerate. Even at stock settings, the X1900XT is still a good performer at such high resolutions!

HDR w/ No AA, 8X AF

ATI Stock 621MHz Core - 720MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
13968, 543034, 2, 57, 25.722

ATI OC 655MHz Core - 792MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
14778, 540610, 16, 58, 27.336

ATI OC 655MHz Core - 792MHz Mem (Standard AF)
Frames, Time (ms), Min, Max, Avg
14013, 501019, 13, 56, 27.969

NV 7900GT VM OC 675MHz Core - 875MHz Mem (Performance)
Frames, Time (ms), Min, Max, Avg
15783, 547224, 0, 58, 28.842

NV 7900GT VM OC 675MHz Core - 875MHz Mem (Quality)
Frames, Time (ms), Min, Max, Avg
13458, 529921, 9, 50, 25.396

NV 7900GT VM OC 675MHz Core - 875MHz Mem (High Quality)
Frames, Time (ms), Min, Max, Avg
11625, 530061, 2, 43, 21.931

Assessment: Although the volt-modded 7900GT OC barely leads, you will note that I am utilizing ATI's High Quality Anisotropic Filtering, which is a much better AF solution than NVIDIA's. In fact, I found some IQ quirks with 8X AF enabled on the 7900GT (see Image Quality section for more details) that could only be minimized, not eliminated, when set to NVIDIA's Quality setting. Doing this also cuts the average framerate of the 7900GT below that of the X1900XT, especially with the High Quality setting. That seems strange, since at best there are only some minute differences in image quality between the two settings from what I can discern. I also tried 4X HQAF on the X1900XT and there was no appreciable framerate difference between the 4X and 8X HQAF modes, so I settled on 8X AF from here on. There also seemed to be no discernible performance hit from enabling HQ AF.
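Working from the 7900GT VM OC averages just reported (28.842, 25.396, and 21.931 FPS for Performance, Quality, and High Quality respectively), the cost of NVIDIA's driver quality modes is easy to quantify; a small Python sketch:

```python
# Percentage framerate cost of NVIDIA's driver image-quality modes, using the
# 7900GT VM OC averages from the outdoor Run A results above.

def pct_drop(baseline_fps, result_fps):
    """Percent of average framerate lost relative to a baseline."""
    return (baseline_fps - result_fps) / baseline_fps * 100.0

print(round(pct_drop(28.842, 25.396), 1))  # Performance -> Quality: ~11.9% loss
print(round(pct_drop(28.842, 21.931), 1))  # Performance -> High Quality: ~24.0% loss
```

For comparison, the same calculation on the dungeon Run B numbers later in this post (42.236 vs 39.878 FPS) gives only about a 5.6% loss from Performance to High Quality, which is why that hit lands inside the margin of error.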

HDR w/ 4X AA, 8X HQAF

ATI Stock 621MHz Core - 720MHz Mem
Frames, Time (ms), Min, Max, Avg
10539, 562581, 0, 36, 18.733

ATI OC 655MHz Core - 792MHz Mem
Frames, Time (ms), Min, Max, Avg
11492, 549567, 12, 45, 20.911

Assessment: One of the unique features of the X1000 line of ATI GPUs is the ability to do BOTH HDR and AA concurrently via the infamous "Chuck" patch, which is unfortunately unsupported, officially, by both Bethesda Softworks and ATI. As you can see above, it does incur a framerate hit, since the outdoor environment is loaded with textures, but the game still plays at a respectable ~21/19 FPS on the X1900XT. The only way to get HDR-like lighting and AA concurrently on the 7900GT is to fall back to the older SM2.0 Bloom feature, which is not as realistic (see pictures in Image Quality section below). I must note that at higher resolutions like 1920x1080, even at 4X AA, it is not readily discernible whether anti-aliasing is on or off. But at lower resolutions like 1280x1024, it is almost a must with the complex and acute sceneries that Oblivion generates (see pictures in Image Quality section below).


Run B (Dungeon Areas): Testing methodology will consist of having the main character run and fight through the first Oblivion Gate dungeon area in Kvatch. Once the Sigil Stone is captured, the FRAP run will be stopped.

HDR w/ No AA, 8X AF - Oblivion Gate

ATI Stock 621MHz Core - 720MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
3260, 91669, 7, 73, 35.563

ATI OC 655MHz Core - 792MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
4002, 99704, 17, 83, 40.139

ATI OC 655MHz Core - 792MHz Mem (Standard AF)
Frames, Time (ms), Min, Max, Avg
3652, 91289, 17, 85, 40.005

NV 7900GT VM OC 675MHz Core - 875MHz Mem (Performance)
Frames, Time (ms), Min, Max, Avg
4028, 95370, 18, 77, 42.236

NV 7900GT VM OC 675MHz Core - 875MHz Mem (Quality)
Frames, Time (ms), Min, Max, Avg
3865, 96102, 18, 71, 40.218

NV 7900GT VM OC 675MHz Core - 875MHz Mem (High Quality)
Frames, Time (ms), Min, Max, Avg
3697, 92707, 18, 70, 39.878

Assessment: Surprisingly, the volt-modded overclocked 7900GT actually takes the crown when visual quality is not a concern. Again, it takes a framerate hit when the image quality is turned up, but the hit from the High Quality setting isn't as apparent as in the outdoors run and, strangely enough, is almost negligible considering the margin of error. The X1900XT scores are very good, especially in light of the fact that HQ 8X AF is enabled. There also seemed to be no discernible performance hit from enabling HQ AF.

Overall Performance Results

Our little David of a volt-modded and overclocked 7900GT threw a pretty good slingshot at its Goliath competitor, keeping up with the X1900XT in the two comparison Oblivion Outdoors and Dungeon runs. But while it was aiming for the X1900XT's head, it merely landed a sucker punch to the gut, as it took an appreciable framerate hit with the image quality settings turned up to mimic ATI's excellent HQ 8X AF solution. Still, that's not bad considering where its anemic stock performance is, and with the affordable entry price point, it's no wonder it's a top seller! As for the X1900XT, it IS all that and a can of worms: performance with great image quality to boot. Make no mistake about it, ATI has a monster on their hands, with nVidia now having to play catch-up to slay the red beast!




Addendum: I was finally able to overclock beyond the ATI Overdrive-derived settings using ATITool. I had experienced frequent instability before; typically an overclock would run OK, then crash the system suddenly. But I finally found the culprit: an insufficient power supply! Although my trusty Silverstone 460W unit seemed capable, never once flinching on the 7900GT even volt-modded and overclocked, I noticed that during artifact testing with the ATI overclocks my 12V line would dip by 250mV. Luckily for me, my local electronics store had a great deal on a Silverstone 600W modular power supply, which I had already picked up but had yet to install. After rectifying this, I was finally able to achieve a stable 692 Core / 842 Memory overclock, which I added to the initial Run A results as a reference for everyone. With the immense power draw of the stock X1900XT, ATI stipulates a minimum 450W power supply, so with a 3.0GHz dual core Opteron CPU, 3 SATA HDDs, and the rest of my system running concurrently, I fathom I taxed out my old power supply. While this may be an isolated incident due to my system configuration, a word of caution goes out to those planning even higher overclocks on the ATI platform; this baby is a powerful Chihuahua but sure guzzles electrons like a grown Doberman.
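As a rough sanity check on that rail reading, here is a sketch; note the +/-5% tolerance window below is an assumption taken from the ATX12V guideline, not something measured on this system:

```python
# Checks a measured 12V rail dip against a nominal tolerance band. The 250mV
# figure is the dip observed above; the +/-5% window is an assumption based
# on the ATX12V guideline, not a measurement from this system.

NOMINAL_V = 12.0
TOLERANCE = 0.05  # ATX12V allows roughly +/-5% on the 12V rail

def rail_within_spec(dip_mv):
    """True if the rail stays above the lower tolerance limit during the dip."""
    return (NOMINAL_V - dip_mv / 1000.0) >= NOMINAL_V * (1.0 - TOLERANCE)

print(rail_within_spec(250))  # True: a 250mV dip is still inside the static band
print(rail_within_spec(700))  # False: a dip past 600mV would fall out of spec
```

The catch is that a rail reading in spec on a multimeter can still mask millisecond-scale droops under transient load, which would be consistent with crashes appearing only during artifact testing at the higher ATI overclocks.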


 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Test 2: Image quality comparisons


Anti-Aliasing (AA)

Simply put, the ATI X1900XT is the unequivocal winner here because, of the two GPUs, ONLY the R580 VPU can process both HDR and AA concurrently. There is one caveat for Oblivion, though: it requires a fix dubbed the "Chuck" patch ( http://support.ati.com/ics/support/defa...ID=894&task=knowledge&questionID=21960 ), but unfortunately, as ATI states, "This driver is provided as a proof-of-concept and is not supported by Bethesda, 2K games or ATI Technologies." Many people, though, have had good success with it. At 1920x1080 resolution, the difference between HDR alone and HDR with even 6X AA isn't readily apparent unless you look for it specifically, but it can still be substantial when zoomed in closely for better scrutiny:

HDR w/ no AA (3X Zoomed): http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-16XAF-HQW.jpg

HDR w/ 6X AA (3X Zoomed): http://i59.photobucket.com/albums/g319/stranx44/ATIHDR6XAA-16XAF-HQW.jpg

Anisotropic Filtering (AF)

ATI as well as NVIDIA has a decent implementation of standard Anisotropic Filtering, as depicted below. One thing to note is that ATI and NVIDIA each offer additional image quality modes for an even better visual experience. On the 7900GT, the difference between the Performance and Quality settings is readily apparent, as I noted some unusual image quality glitches with the Performance setting that I did not readily see with ATI's AF solution. Stepping up to the highest, High Quality, setting doesn't yield a discernible visual improvement over Quality, though. This is one of the reasons I kept the performance benchmarks at the Quality setting on the 7900GT. The same holds true for ATI's High Quality (HQ) setting: I could not distinguish any visible improvements, even at angles, where ATI excels. One thing that really tips the scales in ATI's favor is the performance impact of these settings. NVIDIA's G71 GPU imposes a framerate loss at the higher AF modes, with up to a 20% drop from the lowest (Performance) to the highest (High Quality) visual quality setting, though it still has an acceptable FPS at the middle (Quality) setting. Probably because of the R580's advanced 48 pixel shader pipeline, there was hardly ANY discernible framerate difference between the various AF configurations at all!

ATI X1900XT AF Modes:

No AF: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-NoAF.jpg

4X AF: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-4XAF.jpg

8X AF: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-8XAF.jpg

16X AF: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-16XAF.jpg

16X AF No HQ: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-16XAF-Wall.jpg

16X AF HQ: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-16XAF-HQWall.jpg


NVIDIA 7900GT AF Modes:

No AF: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-NoAF-.jpg

4X AF: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-4XAF-.jpg

8X AF: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-8XAF-B.jpg

16X AF: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-16XAF-.jpg

16X AF Quality: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-16XAF-Q.jpg

16X AF High Quality: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-16XAF-HQ.jpg


Shimmering

Much has been made about the shimmering issues seen in today's GPU solutions. Fact is, BOTH NVIDIA and ATI drivers exhibit shimmering, especially with any AF turned on. It just so happens that NVIDIA shimmers a bit more, so it can exhibit some strange image anomalies like the ones seen below. Most of it can be rectified by setting the image quality setting to Quality, but ATI is still a little more robust.

NV AF shimmer glitch 8X AF: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-8XAF-.jpg

NV AF shimmering glitch 8X AF Quality: http://i59.photobucket.com/albums/g319/stranx44/NVHDRNoAA-8XAF-Q.jpg

ATI AF shimmering glitch 8X AF HQ: http://i59.photobucket.com/albums/g319/stranx44/ATIHDRNoAA-8XHQAF.jpg


Overall Image Quality Results

It is readily apparent that NVIDIA has a lot of catching up to do in terms of image quality. Although it has all the usual assorted AF goodies, the 7900GT drivers exhibit significant shimmering, which can leave quite a visual blemish on some gameplay. And while this can be corrected, it imposes some performance loss in the process, and the card still doesn't have the visual appeal the X1900XT has with its HDR + AA options, as well as better, more advanced AF modes, all without any performance penalties. Our little David came in expecting to fight Goliath, but after putting on some glasses for better visual acuity, found out he was combating Godzilla instead!


Heat, Noise, Misc., and Final Thoughts

There are many other idiosyncrasies that one has to live with when running these two cards overclocked that can't be overlooked.

First and foremost for team Green, presently there are no "soft" modifications to get the required overvolt for these phenomenal overclocks. It does require some due diligence in a hardware rework, albeit, if you look at the instructions depicted in the links above, a rather simple one, since essentially you are drawing a line from point A to point B. I will note, however, that while it may seem crude compared to the X1900XT's more configurable software voltage modification, even that requires some fiddling around with other default ATI software resources that were designed to protect the GPU. Either variation on overclocking will have its nuances, but we will give the edge here to ATI.

Cooling the overclocked and volt-modded little card should be sufficient on the stock heatsink fan, especially if you are only bumping the core voltage up to 1.3V-1.4V from stock and clocking into the 625-650MHz arena. Remember, there are actually 2 sets of default HSFs for the reference 7900GT boards: one that only covers the GPU itself on the "base" model, and another that covers both the GPU and the memory on the "factory overclocked" versions. If you push your voltages, and in turn your overclocks, even higher, I would highly recommend going to an aftermarket solution. Which leads me to team Red's HSF solution, a very beefy unit that covers almost the whole video card itself. It is very similar to the older Arctic Cooling Silencer line, exhausting the air directly outside through the adjacent slot. One noticeable difference, though, is that it is a pull-air design, meaning it sucks air across the heatsinks instead of blowing on them. While the design is still practical, I suspect this is one of the main reasons why it requires a higher rpm, thus lending it to generate more noise.

Speaking of noise, there has been much commotion about both stock solutions, and I noted two distinct characteristics: 1) the stock 7900GT fan is rather paltry and thus prone to higher frequency oscillation, almost a whine per se. Since there is no rpm control for it, the sound is pretty constant. 2) on the X1900XT, at relatively moderate usage, the HSF is rather quiet with only a slight hush of air audible. Since it is a variable speed controlled fan, once the card is loaded up and heat soaked, the noise becomes much more noticeable, until finally it goes to full speed, where I would liken the sound to that of a leafblower; it is loud! Fortunately, I only found that scenario at high overclocks, but it still exhibits some droning even at lower speeds. It is virtually a draw, but I would barely give the edge to ATI.

While the 48 pixel shader architecture of the R580 proves to be a phenomenal performer, ATI's implementation entails a much higher transistor count and in turn much higher power draw, even with the 90nm process used for the VPU. I noted max temps in the range of 89C during the course of heavy usage on my overclocked X1900XT. This caused the stock HSF to kick into high fan mode with its unbearable noise. Where there is high heat dissipation, there is also high power consumption! As noted in the addendum above, I had some power supply issues with my 460W unit that were prohibiting me from high overclocks, rectified only after going to a 600W unit. The X1900XT is THAT power hungry! On the other hand, nVidia chose a more conservative approach to their design, more evolutionary than revolutionary (unlike the R580). The 90nm G71 is essentially a die-shrunk G70 core, albeit with some slight tweaks to cache branch predictions and other undocumented changes for performance improvements. Where it really shines is in its lower power consumption. In stock form it is so frugal that HTPC / silent enthusiasts will delight to hear that even passive HSFs will be available to cool it (AC's upcoming S1 line). At overclocked and even volt-modded speeds, the power consumption does go up, but not to the point of requiring a 500W+ PSU (note: even the comparable 7900GTX platform calls for only a minimum 400W unit). And while it is not necessarily a fair comparison, the 7900GT used in testing reached a maximum of 62C, though with an aftermarket HSF. Previous testing I have done with a volt-modded overclocked 7900GT on the stock HSF shows it ranging up to the 69C range. Without a doubt, the clear winner in the heat and power consumption department is the 7900GT.

During the course of these tests, I gained a newfound respect for the ATI X1900XT. It can both perform AND look good at the same time, without any penalties thereof. But for a top-of-the-line, nearly $500 card, even overclocked, I still wanted more: a better and quieter stock HSF (and a single slot solution would be candy, although almost improbable), less power consumption and thus heat, and, while we're asking Santa, even more performance to distance itself. The volt-modded overclocked 7900GT was sort of a revelation: it could overclock well beyond most folks' wildest expectations, but in terms of image quality I was left with a somewhat bitter taste. The only trump cards really in its favor are the reasonable entry price of ~$300 and its lower power consumption. In the end, these two cards were never really destined to compete with one another, and the features and idiosyncrasies show it. The true killer of both cards might actually be the forthcoming X1900GT, which, if unlockable and highly overclockable to near-XT speeds, would be the veritable champ in value. But that...is for another review!

 

Alaa

Senior member
Apr 26, 2005
839
8
81
gr8 results, gr8 work. i was amazed by those numbers really. 7900gt VM performing like the x1900xt, wow.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Not bad, but I can see ALOT of inconsistencies there, so it's not quite accurate to the frame.
 

Todd33

Diamond Member
Oct 16, 2003
7,842
2
81
Sounds like the OP had an agenda and proved it. Not exactly scientific.
 

Elfear

Diamond Member
May 30, 2004
7,115
690
126
I've talked with ST and I'll be posting some results with my card once I actually get the game installed this week. The results will be just to show what an oced X1900XT will do. I'll try and follow his path as best as possible so that results are comparable.

Thanks for the work you put into this ST.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Todd33
Sounds like the OP had an agenda and proved it. Not exactly scientific.

how's that?

coming from someone (me) who has butted heads with ST quite often, i'd have to say he's done a pretty decent job (so far) of investigating this scenario from a fairly objective point of view.

we all knew (well, those who were reasonably objective) a GT overclocked to the extreme such as his would perform well (however i have yet to see his oc results are anything but atypical), and that the XT would still offer performance with the advantage of some IQ enhancements not available to the GT.

while he's provided what i consider a well-written post on his findings, the results have so far been what many people here expected them to be.
 

golem

Senior member
Oct 6, 2000
838
3
76
Agreed. If you see a problem with what ST has done so far, or with his results, then spell it out. Don't just post some vague blurb about inconsistencies or an agenda and leave it at that.

Originally posted by: CaiNaM
Originally posted by: Todd33
Sounds like the OP had an agenda and proved it. Not exactly scientific.

how's that?

coming from someone (me) who has butted heads with ST quite often, i'd have to say he's done a pretty decent job (so far) of investigating this scenario from a fairly objective point of view.

we all knew (well, those who were reasonably objective) a GT overclocked to the extreme such as his would perform well (however i have yet to see his oc results are anything but atypical), and that the XT would still offer performance with the advantage of some IQ enhancements not available to the GT.

while he's provided a what i consider a well-written post on his findings, the results have so far been what many people here expected them to be.

 

morgash

Golden Member
Nov 24, 2005
1,234
0
0
thirded on that. i find his results to be fairly accurate and close to what i was expecting. i WAS expecting the 512 vs 256mb of RAM issue to show up in high res with all the eye candy. it amazes me that all that data managed to fit into the 7900gt without any texture thrashing. i run mine at 625/1950 and i still get the occasional pause as the GPU grabs all the data in the RAM and processes it. now i really want to see a x1800xt benched in exactly the same way stock and OC so we can put that issue to rest as well.

Morgash
 

Sc4freak

Guest
Oct 22, 2004
953
0
0
I'm completely unfamiliar with Nvidia's control panel settings, but isn't there a "High Quality" setting as well, in addition to "Quality"?

If so, what's the performance hit by turning that on?
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
Originally posted by: Sc4freak
I'm completely unfamiliar with Nvidia's control panel settings, but isn't there a "High Quality" setting as well, in addition to "Quality"?

If so, what's the performance hit by turning that on?


Yes there is and I'm wondering as well why he didn't use it if this is an apples-apples review. HQ AF takes a noticeable hit on G70/G71 based cores from my experience. Once he's done with the review, there should be a breakdown on what it would cost someone to go from a stock 7900 GT to a fully modded one with the assumption that they do not have any of the supplies on hand. From what I calculated in other posts, it drives the cost up by about $100 (includes an aftermarket cooler which is required if you're going to run at 1.5v+).
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Originally posted by: Extelleron
Not bad, but I can see ALOT of inconsistencies there, so it's not quite accurate to the frame.

thought there might have been

can't have anything nvidia coming close to ATI right?

what if there were tons of inconsistencies, but ATI was shown to be dominating completely? i don't think you would be saying what you just said if that was the case
 

thepd7

Diamond Member
Jan 2, 2005
9,429
0
0
Originally posted by: otispunkmeyer
Originally posted by: Extelleron
Not bad, but I can see ALOT of inconsistencies there, so it's not quite accurate to the frame.

thought there might have been

can't have anything nvidia coming close to ATI right?

what if there were tons of inconsistencies, but ATI was shown to be dominating completely? i don't think you would be saying what you just said if that was the case

Obviously you're an nvidia fanboy, since he is obviously an ATI one, even though he didn't say a single thing about it. Since you disagree with him, you are nvidia. And since I think you are an idiot for making ASSumptions, I am obviously an ATI fanboy. This is all quite clear even if you never mention ATI, nvidia, or anything else.
 

Sable

Golden Member
Jan 7, 2006
1,127
99
91
Originally posted by: Todd33
Sounds like the OP had an agenda and proved it. Not exactly scientific.

Eh? I see no problem at all with that review. Speeds are similar, ATI looks a bit better.

Only question I have is about the image quality test.

Was "quality" or "high quality" used in the nv control panel? As has been mentioned, this would cause a greater performance hit, and the use of HQ on the ATI would indeed make this an unfair comparison.

 

NoDamage

Member
Oct 7, 2000
65
0
0
ATI Stock 621MHz Core - 720MHz Mem
Frames, Time (ms), Min, Max, Avg
14599, 523496, 18, 66, 27.888

ATI OC 655MHz Core - 792MHz Mem
Frames, Time (ms), Min, Max, Avg
16561, 551838, 11, 63, 30.011
NV 7900GT VM OC 675MHz Core - 875MHz Mem (Performance)
Frames, Time (ms), Min, Max, Avg
15783, 547224, 0, 58, 28.842

NV 7900GT VM OC 675MHz Core - 875MHz Mem (Quality)
Frames, Time (ms), Min, Max, Avg
13458, 529921, 9, 50, 25.396
The min. fps numbers here seem a bit inconsistent to me. On the X1900XT you see a significantly lower min. fps on the higher clocked part, and on the 7900GT you see a lower min. fps on the performance setting. This is probably due to inconsistent runs for each benchmark (especially if you are doing combat, then the number of opponents on the screen at a given time will affect framerate).
 

Alaa

Senior member
Apr 26, 2005
839
8
81
Originally posted by: 5150Joker
Originally posted by: Sc4freak
I'm completely unfamiliar with nVidia's control panel settings, but isn't there a "High Quality" setting as well, in addition to "Quality"?

If so, what's the performance hit by turning that on?


Yes there is, and I'm wondering as well why he didn't use it if this is an apples-to-apples review. HQ AF takes a noticeable hit on G70/G71-based cores in my experience. Once he's done with the review, there should be a breakdown of what it would cost someone to go from a stock 7900GT to a fully modded one, assuming they don't have any of the supplies on hand. From what I calculated in other posts, it drives the cost up by about $100 (including an aftermarket cooler, which is required if you're going to run at 1.5V+).

Almost everyone will recommend an aftermarket cooler for the XT as well.
 

Elfear

Diamond Member
May 30, 2004
7,115
690
126
Originally posted by: Alaa

Almost everyone will recommend an aftermarket cooler for the XT as well.

For overclocking, yes, but for the speeds ST ran his at, the stock cooler is fine. It is loud at full speed, but that is hardly ever reached even during hard gaming. It might be different now that summer is coming, but most of the reader reports I've seen say the card isn't that loud during actual usage.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
The one point I'll take issue with is that the review is not written with any hint of impartiality. The 7900GT wins or draws with the more expensive X1900XT, but the ATI card does allow you to use the AA+HDR feature. In short, the numbers are being twisted to show the ATI card to have more of an advantage than is really there.

Summary as it should be: the voltmodded 7900GT performs as well as the X1900XT overclocked using software, but it does not allow for the use of AA+HDR. The price difference is not negligible, and some will argue that the performance hit caused by AA+HDR makes the game unplayable. There are issues with replacement heatsinks (the 7900GT needs one and the X1900XT benefits greatly from one) and with the ease of overclocking the X1900XT, making this choice harder for a potential buyer.

Edit: This should be taken as constructive criticism. You're doing something that can be referred to for settling some of the more tedious arguments around here, but its impact is lessened to nothing if your numbers don't support your statements, and right now they don't.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: otispunkmeyer
Originally posted by: Extelleron
Not bad, but I can see a lot of inconsistencies there, so it's not quite accurate to the frame.

Thought there might have been.

Can't have anything nVidia coming close to ATI, right?

What if there were tons of inconsistencies, but ATI was shown to be dominating completely? I don't think you would be saying what you just said if that were the case.

If you look at some of the results, you'll see the nVidia 7900GT overclocked w/ voltage sometimes BEHIND the regular OC'd one..... an inconsistency. In the end, this benchmark is not 100% valid, because sometimes you do things slightly differently: run in a different direction, swing the sword at a different time, etc. There is no way to ensure 100% validity. Therefore this benchmark should be considered to have, at the very least, a margin of error of around 5-6 FPS.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: Extelleron

If you look at some of the results, you'll see the nVidia 7900GT overclocked w/ voltage sometimes BEHIND the regular OC'd one..... an inconsistency. In the end, this benchmark is not 100% valid, because sometimes you do things slightly differently: run in a different direction, swing the sword at a different time, etc. There is no way to ensure 100% validity. Therefore this benchmark should be considered to have, at the very least, a margin of error of around 5-6 FPS.

"Note: Because I am doing the frapping in actual gameplay, there will be some variances in the tests. I would imagine a margin of error of ~5% for the scores due to the random nature executing each run in a real live game environment. This philosophy in testing follows Anandtech's own GPU comparion in Oblivion ( http://www.anandtech.com/video/showdoc.aspx?i=2746 ). Many of the tests were double, even triple checked to ensure accuracy. What should matter most to folks is the not necessarily the maximum framerate, but rather the average, and to some certain extent, the minium, as it will be more indicative of actual gameplay."

 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: ST
Originally posted by: Extelleron

If you look at some of the results, you'll see the nVidia 7900GT overclocked w/ voltage sometimes BEHIND the regular OC'd one..... an inconsistency. In the end, this benchmark is not 100% valid, because sometimes you do things slightly differently: run in a different direction, swing the sword at a different time, etc. There is no way to ensure 100% validity. Therefore this benchmark should be considered to have, at the very least, a margin of error of around 5-6 FPS.

"Note: Because I am doing the frapping in actual gameplay, there will be some variances in the tests. I would imagine a margin of error of ~5% for the scores due to the random nature executing each run in a real live game environment. This philosophy in testing follows Anandtech's own GPU comparion in Oblivion ( http://www.anandtech.com/video/showdoc.aspx?i=2746 ). Many of the tests were double, even triple checked to ensure accuracy. What should matter most to folks is the not necessarily the maximum framerate, but rather the average, and to some certain extent, the minium, as it will be more indicative of actual gameplay."
:thumbsup:

 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
People on the Xtremesystems forums have gotten much higher OCs from their X1900s. The X1800s used to be able to hit up to 730MHz on the core.
 