PlatinumGold
Lifer
- Aug 11, 2000
how ironic. if i read the reviews correctly, Nvidia has the slightly better hardware, ATI the better fit and finish.
i really hate to burst your bubble, but FarCry ONLY uses PS3 for H2O effects . . . by the time the GF makes a diff in games, you will have upgraded to NV45/50 or r500.
Originally posted by: Shad0hawK
Originally posted by: GTaudiophile
BTW, after watching that Ruby demo movie...
Nalu's flipper is toast!
i just watched it, it did not impress me much after seeing individual strands of hair move in water (plus the ATI demo was very cheesy)
"we will meet again my dear ruby!"
*overly dramatic music plays*
ROFL!!!
i guess i will be playing farcry, stalker, battle for middle earth...etc. in PS3 with superior framerates and image quality while the ATI fanboys console themselves with the fact they only have to use one molex connector...
Originally posted by: Wolfsraider
1 RADEON™ X800 PRO Graphics Board
1 Ruby T-Shirt - Size Large
1 RADEON™ X800 Mouse Pad
1 RADEON™ X800 Poster
$449 USD
from here
was getting a new driver ...follow the pop up and you will see the offer
mike
edit link doesn't work
Our driver team has been busy with stability, bugs and performance tweaks as we get closer to the ship dates for our OEM partners and add-in card partners. Areas of interest compared to 60.72 driver include:
5% boost in Aquamark
5% boost in X2 performance
5% boost in Tomb Raider: Angel of Darkness performance
10% boost to Call of Duty performance
50% improvements in Far Cry performance
First of all, it ain't over . . .
Originally posted by: GTaudiophile
Let's not forget the X800 chips are made with Low-K while the 6800 chips are not. I have the feeling the X800 chips will scale better, much better than nVidia would like to admit.
And it's funny to me, after seeing how much trouble nVidia had with .13mu, how ATi has taken off with it, using more advanced technology.
[nVidia] would make a response to ATI's "Platinum" X800 XT.
According to the site, Neuschäfer said Nvidia's partners will create cards to counter the ATI strike, and so it looks like the graphics wars are getting ever more intense.
Various suggestions have been put forward by exceptionally well informed individuals, but more than one has basically suggested that 'yields suck at IBM' and that 'maybe other (IBM) customers are having the same issues'. We've reported on yields problems at IBM in previous weeks.
Following the launch of the ill-fated GeForce FX 5800 Ultra (NV30), we seem to recall some public comments by Nvidia, and definitely from one of its partners, which referred to process problems at TSMC and essentially seemed to lay significant blame at TSMC's feet for the failure of NV30.
Subsequently when Nvidia started to canoodle with IBM, we heard lots of loving noises from Nvidia PR regarding IBM. TSMC might not be too happy with Nvidia - certainly ATI is now one of its very valued customers.
NVIDIA'S mid-range GeForce FX 5700 Ultra (NV36) has had a bit of a rocky history. It was first introduced with GDDR2 video memory but as stocks of GDDR2 dwindled, it seems Nvidia was more or less forced to re-equip 5700 Ultra with the new, initially more expensive, GDDR3.
The background to this seems to be that Samsung had already been burnt somewhat over its GDDR2 business with Nvidia. This was because when Nvidia's first GDDR2 based product, GeForce FX 5800 Ultra (NV30), failed to be a commercial success, Samsung was stuck with the GDDR2 it had specifically created.
This was an opportunity that ATI rapidly capitalised upon, and it subsequently bought a significant amount of GDDR2 from Samsung for its 256MB Radeon 9800 PRO. Apparently the decision was a bit of a no-brainer, as it seems the price paid by ATI for these GDDR2 modules wasn't dissimilar to the price it was already paying for the slower DDR1 modules, with which it populated its 128MB variants of Radeon 9800 PRO.
So, we understand, having been burnt once, and with the imminent arrival of GDDR3 into the market, Samsung declined to produce any more GDDR2 modules for Nvidia, leaving the graphics firm with one of two options: either go with GDDR3 or drop back to the slower DDR1.
This worked out perfectly for ATI. Not only because of the aggressive price it paid for a significant amount of high-performance GDDR2, but because in adopting GDDR3 early, Nvidia was effectively dropping the cost of this new technology, and starting to create volume for Samsung.
This was and is an important factor for ATI as next week, like Nvidia's GeForce 5700 Ultra and GeForce 6800 Series, ATI will introduce its own GDDR3-equipped 3D accelerators, the Radeon X800 Series.
Nvidia being the first to adopt GDDR3 probably reduced 5700 Ultra margins in the already cut-throat mid-range consumer graphics sector.
But there's an ironic twist to this tale.
TSMC favoured Nvidia and thought ATI's proposals were 'like the tail trying to wag the dog', the thinking way back being that Nvidia was the top dog.
So Samsung was producing GDDR2 for Nvidia's GeForce FX 5800 Ultra (NV30), but then got burned by NV30's lack of success. After being quite negatively vocal about ATI for backing GDDR3, it's very ironic that Samsung is now pushing GDDR3 so heavily.
Originally posted by: apoppin
Finally working ati promo link . . .
(i got it 2)
Yep, an HL2 coupon.
:roll:
I'm waitin' for the xt-PE/6850u editions
*EDIT: $15 shipping - "ground" . . . here is where ati lacks 'class'.
:roll:
Those ati cheapskates are charging $450.00 for their no.2 or 3 card then have the audacity to charge $15 for a GROUND (5-7 days) shipping. That's immoral . . . "class" would indicate either including the shipping charge (ala NewEgg) or at the LEAST offering OVERNIGHT . . .
Originally posted by: shady06
Originally posted by: apoppin
Finally working ati promo link . . .
(i got it 2)
Yep, an HL2 coupon.
:roll:
I'm waitin' for the xt-PE/6850u editions
*EDIT: $15 shipping - "ground" . . . here is where ati lacks 'class'.
:roll:
if they can get suckers to pay for it, why not? i dont see it as a lack of class at all, just good business
Maintaining Gainward's tradition of product names to confuse a cunning linguist, this flagship product is identified as the CoolFX PowerPack! Ultra/2600 "Golden Sample".
With today's launch of ATi's X800 Series graphics cards, Nvidia's own attempt to dampen down the positive press ATi is currently basking in has been a mild, knee-jerk 50MHz overclock of the original 400MHz NV40 reference boards, but memory speeds remain the same at 550MHz.
Not content with this, Gainward has hand-picked a batch of the fastest 450MHz '6800 Ultra XTreme' GPUs, slipped on a trick Innovatek water block, and hopes to tweak the cores of these up to 150MHz higher than other GeForce 6800 Ultras.
The Gainward CoolFX Ultra/2600 will feature 256MB of full-spec 600MHz GDDR3 memory, which Gainward anticipates will clock beyond its rated 1200MHz DDR. All this willy-waving tweakery is expected to yield further performance increases upwards of 20%.
Based upon the numbers we've seen on even the vanilla Nvidia GeForce 6800 Ultra reference samples, we think the performance potential of the CoolFX Ultra/2600 "Golden Sample" is unlikely to disappoint. Though owning the fastest production 3D graphics accelerator bar none won't cost pennies: the CoolFX will cost in the region of £599, so it's sure to be as exclusive as it is rapid.
That said, CoolFX Ultra/2600 does come complete with genuine Innovatek water cooling components including a high performance heat-exchanger, low noise 120mm fan and, say Gainward, "special plastic pipes and fittings to connect all components into a sealed water circulation system".
If you must have all this high performance in a wholly silent system, then CoolFX can be extended with the Gainward CoolPC upgrade kit, which includes AMD or Intel processor and core logic water blocks, plus all ancillaries, and coming in at around £100 inc VAT doesn't seem too bad.
As we also reported earlier, Gainward will also introduce a somewhat less expensive advanced air-cooled variant, the PowerPack! Ultra/2600 "Golden Sample". This is another 256MB GDDR3 board based upon the 16-pipeline GeForce 6800 Ultra, but is more conservatively clocked at 430MHz core and 1150MHz memory. Notwithstanding these lower base frequencies, the performance of PowerPack! Ultra/2600 is still likely to benefit from whatever Enhanced Settings Gainward's EXPERTool software tweaking utility will allow.
Sure, but it isn't necessary, nor does it gain a much higher o/c than air-cooling - unlike the nVidia GPU.
Originally posted by: LTC8K6
Well, the X800XT can be water cooled too, can't it?
Being a lot cooler in the first place.......
Originally posted by: RussianSensation
Actually, Crytek later released an explanation saying the comparison was between PS3.0/2.0 and PS1.1
The fact that Nvidia explained in many interviews that PS3.0 does NOT offer image quality enhancements over PS2.0 doesn't seem to sway your opinion. PS3.0 is supposed to make things faster, which in reality suggests that you COULD write longer instruction sets, with branching allowing for deeper shaders and thus better image quality. But if you wanted to, you could write the same code using PS2.0; it would just take longer. And if ATI cards already work faster in PS2.0, there is no need for them at this point to utilize PS3.0. Logically, if Nvidia works slower in PS2.0, it would only make sense for it to work SLOWER in a LONGER instruction set of PS3.0. PS2.0 can do everything PS3.0 can, so image quality is IDENTICAL; just the way to get the same image is supposed to be optimized in 3.0, but that hasn't been proven yet with any game. Now Nvidia is counting on running their games in fewer loops using PS3.0. I am not saying ATI is better and be done with it, but more games have to be compared that utilize PS3.0 to really decide if that feature is important or not.
As it stands, X800xt is the clear performance winner if you disregard price. Comparison of X800Pro to GT is another story which must be more carefully analyzed.
There is no point in arguing, as the ppl who made up their minds on buying ATI or Nvidia will never change their view, and then there remain the rest of us who want to wait for the price war to play out and see what the best card to buy is.
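The branching argument in that post can be pictured with a toy sketch (hypothetical Python standing in for shader execution; the per-pixel cost of 10 "ops" and the 25% shiny-pixel figure are invented for illustration, not measurements):

```python
# Toy model of why SM3.0-style dynamic branching can beat SM2.0-style
# flat execution. "work" counts pretend instruction costs (made up).

def shade_sm2(pixels):
    """No real branching: every pixel pays for the expensive term,
    then a select picks which result to keep."""
    work = 0
    for shiny, base in pixels:
        specular = base * 0.5                        # always computed
        work += 10                                   # pretend specular costs 10 ops
        color = base + (specular if shiny else 0.0)  # select the result
    return work

def shade_sm3(pixels):
    """Dynamic branching: pixels that don't need the expensive term
    skip it entirely."""
    work = 0
    for shiny, base in pixels:
        if shiny:                                    # branch taken only where needed
            specular = base * 0.5
            work += 10
            color = base + specular
        else:
            color = base
    return work

pixels = [(i % 4 == 0, 1.0) for i in range(1000)]  # 25% of pixels are shiny
print(shade_sm2(pixels))  # 10000: every pixel pays
print(shade_sm3(pixels))  # 2500: only shiny pixels pay
```

The final image is identical either way, which is the poster's point: branching changes the cost, not the output.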
SM3.0 & Displacement Mapping
One of the more major upgrades in Shader Model 3.0 is the addition of Vertex Texture Lookups. What this allows is features like Displacement Mapping. If there is going to be any major difference in image quality comparing Shader Model 3.0 to 2.0, it is going to be with the use of Displacement Mapping. Bump mapping, which is currently used to give the appearance of height in textures, is just an illusion. There is no physical difference in the texture, meaning if you look at the texture from the side or dead on you will see that it is still flat; only from far away does bump mapping work. Even then it isn't the best option: since the texture is physically still flat, light and shadows do not reflect correctly. The answer is Displacement Mapping, which physically adds surface detail by manipulating the height of the texture. Displacement Mapping can even go as far as to create the model itself. Displacement Mapping may be a huge boon to adding realism in games. If developers pick up on this technology and we see it implemented in games, this right here could be the deciding feature that shows the most difference between a game rendered in Shader Model 3.0 and a game rendered in Shader Model 2.0.
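The idea in the quoted paragraph can be illustrated with a toy sketch (hypothetical Python; `height_map`, `displace`, and all the numbers are invented, and a real implementation would run in a vertex shader doing an actual texture fetch). Each vertex samples a height value and is physically moved, so the geometry and silhouette really change, unlike bump mapping, which only perturbs the lighting of a flat surface:

```python
# Toy displacement mapping: a flat strip of vertices is physically moved
# by sampling a "height map" (a stand-in for a vertex texture lookup).

height_map = [0.0, 0.2, 0.5, 0.2, 0.0]  # pretend texture data

def displace(vertices, heights, scale=1.0):
    """Offset each (x, y) vertex's y by the sampled height value.
    The mesh itself changes, so it looks raised from any angle."""
    out = []
    for i, (x, y) in enumerate(vertices):
        h = heights[i % len(heights)]  # per-vertex "texture fetch"
        out.append((x, y + scale * h))
    return out

flat = [(float(x), 0.0) for x in range(5)]
bumped = displace(flat, height_map)
print(bumped)  # [(0.0, 0.0), (1.0, 0.2), (2.0, 0.5), (3.0, 0.2), (4.0, 0.0)]
```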
Shader Lengths & Passes
First let's look at the shader length in the Pixel and Vertex Shader. In Pixel Shader 2.0 the shader length can be up to 96 instructions long. With Pixel Shader 3.0 that instruction count has been increased to 65,535, and in the GeForce 6800 Ultra the instruction count is unlimited. With a longer instruction limit, more effects can be applied per pixel in a game. However, writing a shader program with a very long instruction count is not the only way to achieve an effect that would require such a long shader. Long shaders can also be achieved by using multipassing. For example, you could have several 96-instruction-length shaders in Pixel Shader 2.0 that perform an effect by doing them in several passes, thus achieving the same effect that one very long shader program outputs. In fact, the Radeon 9800 series has built within it an F-Buffer that is meant to aid shader multipassing.
One of the most anticipated games this year, Half Life 2, uses Shader Model 2.0 extensively but only has a shader length of 30-40 instructions, not even coming close to the 96-instruction limit in Pixel Shader 2.0. So concerning shader length, what we are left with is going to be the performance differences between running Shader Model 2.0 programs in many passes with multipassing versus running Shader Model 3.0 with very long shaders. As of right now we have no idea how well the GeForce 6 series video cards can run very long shaders, and we also have no idea how well the Radeon series can run Pixel Shader 2.0 instructions with many passes.
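The multipassing trade-off described above boils down to ceiling division against the per-pass instruction limit (a rough Python sketch; the 96 and 65,535 limits come from the quoted text, while the 300-instruction effect is an invented example):

```python
# Sketch of multipassing: a shader longer than PS2.0's 96-instruction
# limit must be split into several passes, each within the limit, at the
# cost of per-pass overhead (writing/reading intermediate results).

PS20_LIMIT = 96

def passes_needed(shader_len, limit=PS20_LIMIT):
    """Number of rendering passes for a shader of shader_len
    instructions on hardware with a per-pass instruction limit."""
    return -(-shader_len // limit)  # ceiling division

# A Half-Life 2-style 40-instruction shader fits in one pass either way;
# a hypothetical 300-instruction effect needs 4 passes on PS2.0 but
# only one pass under PS3.0's 65,535 limit.
print(passes_needed(40))          # 1
print(passes_needed(300))         # 4
print(passes_needed(300, 65535))  # 1
```

Which side wins in practice then depends on exactly what the article says is unknown: how fast each architecture runs long shaders versus many passes.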
I believe you are correct. That would explain why they don't look the same.
Originally posted by: SmokeRngs
Do not quote me on this and I don't have time to check it out right now, but I thought the nVidia screenshots were from a demo and not the actual game while the screenshots HardOCP did were from the actual game. If that is the case, it would explain the reason why the statues are not the same.
Originally posted by: NOX
These are: http://www.pcper.com/article.php?aid=36.
Originally posted by: LTC8K6
Are those the shots that are actually 1.1 compared to 3.0?
But the ones Shadowhawk linked to from HardOCP are 2.0 to whatever Nvidia is doing. Not to mention the HardOCP statue is not the same as the original screenshot. Not sure why, maybe they couldn't find the same one or they figure it was just for the sake of comparison.
What about the Gainward CoolFX Ultra/2600?
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599
GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499
GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399
GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299
If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Are those the proper prices?
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599
GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499
GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399
GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299
If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Originally posted by: BoomAM
Are those the proper prices?
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599
GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499
GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399
GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299
If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
It's criminal that nVidia are charging $100 more for just 50MHz extra on the core!
As I've said before, the GT is the interesting card of the nVidia line-up. At first they'll probably be using 12-pipe silicon that can only hit 350, but after a few months, when the process has been refined, I think we'll see GTs with cores that can OC a LOT more.
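As a back-of-envelope check of that "$100 for 50MHz" complaint (plain arithmetic in Python, using the list prices and core clocks quoted above):

```python
# What the Ultra Extreme's last 50MHz costs per MHz, versus the plain
# Ultra's overall dollars-per-MHz, from the quoted price list.

ultra = {"price": 499, "core_mhz": 400}
ultra_extreme = {"price": 599, "core_mhz": 450}

premium = ultra_extreme["price"] - ultra["price"]          # $100
extra_mhz = ultra_extreme["core_mhz"] - ultra["core_mhz"]  # 50 MHz

print(premium / extra_mhz)                 # 2.0 -> $2 per extra MHz
print(ultra["price"] / ultra["core_mhz"])  # 1.2475 -> ~$1.25/MHz for the Ultra
```

So the marginal 50MHz is priced at roughly 60% more per MHz than the Ultra's baseline, which is the gist of the objection.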
Originally posted by: ed21x
Originally posted by: BoomAM
Are those the proper prices?
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599
GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499
GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399
GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299
If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
It's criminal that nVidia are charging $100 more for just 50MHz extra on the core!
As I've said before, the GT is the interesting card of the nVidia line-up. At first they'll probably be using 12-pipe silicon that can only hit 350, but after a few months, when the process has been refined, I think we'll see GTs with cores that can OC a LOT more.
I'm willing to bet the release of the Ultra Extreme was more or less a marketing ploy like the P4EE, just so the company can claim the top spot. I'm fairly certain they don't expect to sell that many of those cards, and will probably only make any money off of it through vendors like Alienware and VoodooPC.
Originally posted by: apoppin
What about the Gainward CoolFX Ultra/2600?
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599
GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499
GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399
GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299
If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Its 256MB of full-spec 600MHz GDDR3 memory will clock beyond 1200MHz DDR, with a core clock of ~600MHz.
EDIT: WHEN? Ati's shipping pro now and xt b4 end of the month; 6800 end of month . . . more limited than i think they like.
(no discounts)
I'd throw the salesman out of the building for trying to sell me that!
Originally posted by: Nebor
Originally posted by: apoppin
What about the Gainward CoolFX Ultra/2600?
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599
GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499
GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399
GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299
If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Its 256MB of full-spec 600MHz GDDR3 memory will clock beyond 1200MHz DDR, with a core clock of ~600MHz.
EDIT: WHEN? Ati's shipping pro now and xt b4 end of the month; 6800 end of month . . . more limited than i think they like.
(no discounts)
I'll throw down $600 for the Gainward version.