actual G70 card at Hardwarezone

Page 3 - AnandTech Forums

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Humey:
no way is a new G70 flagship card gonna have 5mhz core over a older 6800ultra

Seems to me that new cores are often slower than older cores. An fx5950U was clocked at 475Mhz, much faster clocks than any 6800. Heck a 5800U was 500mhz. 9600XT's were clocked at 500mhz also.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It sounds like you may have a faulty card then seeing as thousands of other people don't appear to have any problems with heat on that card.

According to ATi the card is just fine- it is operating within specs. They told me if the card heats up enough to be unstable then I must not have enough cooling in my case(XaserIII V1000A- guess they must think that watercooling or phase change is a standard part of PCs).

Have you tried cleaning the HSF?

It has done this since it was new but to answer your question yes I have- I'm extremely careful with dust in my rig and clean my air filters(one for every intake fan) regularly also.

Do you really think ATi is that stupid that they would continually ship a cooler that doesn't cool the card enough?

Stupid, hmmm, depends on how you look at it. They got my money, and then they can tell me to go to he!l- I was the stupid one thinking that they sold a reliable product(been through it multiple times before, I knew better). I have been informed by ATi that they offer no warranty that any game will ever work on their hardware either- although that took an awful lot of ripping them about Sacrifice before they would say anything at all. I purchased a BBA R9800Pro and can honestly say that I received significantly more support from nVidia and 3Dfx when I purchased boards that simply used some of their parts- which isn't really saying much at all(didn't much care for 3dfx).

Do I think ATi would sell a product they knew did not function properly- yes- and they have proven it multiple times. You want one obvious one check out the box for the RageFury Maxx and look at OSs supported. ATi will lie- well beyond marketing speak- flat out lie to get people's money.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: BenSkywalker

According to ATi the card is just fine- it is operating within specs. They told me if the card heats up enough to be unstable then I must not have enough cooling in my case(XaserIII V1000A- guess they must think that watercooling or phase change is a standard part of PCs).
Have you tried replacing the thermal compound on the heatsink with some Arctic Silver? The standard thermal compound on those cards isn't that great. The factory application of thermal compound varies, and maybe you got one that wasn't so good.

 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Do I think ATi would sell a product they knew did not function properly- yes- and they have proven it multiple times.

Well, I guess the thousands of perfectly functioning R9800Pro cards out there are just flukes? It's inconceivable you may have just gotten a bad card?

You want one obvious one check out the box for the RageFury Maxx and look at OSs supported.

The MAXX card doesn't work in Win2K because Win2K doesn't support multiple GPUs on a single graphics device very well at all (it's apparently next to impossible without special hardware support on the card). Considering that Win2K wasn't out when they released the card originally, I find it difficult to blame ATI for this one. Driver support for future OSes is rarely a guarantee with any kind of hardware.
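For anyone unfamiliar with the MAXX's scheme, the idea behind Alternate Frame Rendering is simple to sketch: even frames go to one GPU, odd frames to the other, and the driver must present them back in order behind a single adapter, which is exactly what Win2K's display model made difficult. This is a minimal illustrative sketch, not actual driver code; all names are made up.

```python
# Hypothetical sketch of Alternate Frame Rendering (AFR) frame dispatch,
# the technique the Rage Fury MAXX used: frames alternate between two
# GPUs round-robin, and the driver presents them in submission order.
def assign_frames_afr(num_frames, num_gpus=2):
    """Map each frame index to the GPU that renders it (round-robin)."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

schedule = assign_frames_afr(6)
print(schedule)  # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
```

The hard part in practice isn't the round-robin itself but presenting frames from two physical devices as one logical display, which is why OS-level support (or special hardware) matters.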

ATi will lie- well beyond marketing speak- flat out lie to get people's money.

While I can't defend every move ever made by ATI, their competitors hardly have a spotless record in this regard either. NVIDIA's blatant cheating with 3DMark03 (hardcoding their drivers around the exact benchmarks to boost their scores), and their highly questionable marketing on the PVP chip (revising the feature set six months after release), for instance?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Have you tried replacing the thermal compound on the heatsink with some Arctic Silver? The standard thermal compound on those cards isn't that great. The factory application of thermal compound varies, and maybe you got one that wasn't so good.

No, which I should have done. Actually, I was planning on replacing the stock cooler with an aftermarket unit and using some AS but with the problems I had with the board I decided not to modify it at all in case it completely died on me. Normally I void the warranty OCing and don't care, but there was no way I was going to push the core I got(one of the earlier extremely hot cores obviously) so I opted to keep it completely stock. The board would have been replaced long ago at this point if not for the fact that there isn't much motivation to upgrade at this point. D3 and HL2 were enormous let downs for me leaving FarCry as the only real GPU strainer I have played and enjoyed for some time. As of now I'll wait until the next gen parts hit- the G70 at least should be good for a long time with ports crossing over from the consoles(the R500 would be good too, if ATi didn't decide to release the dumbed down R520 instead).

Edit-

Well, I guess the thousands of perfectly functioning R9800Pro cards out there are just flukes?

Define perfectly functioning first. I heard about the incredible perfectly functioning Rage128 parts, the perfectly functioning R100 parts, the perfectly functioning R200 parts and the perfectly functioning at launch R300 parts all of which were laughable at best. You want to see something broken with a R9800Pro fire up Sacrifice using any driver revision. The game will not render properly. ATi has known about it for years too.

Considering that Win2K wasn't out when they released the card originally, I find it difficult to blame ATI for this one.

They never dropped their claim of Win2K support from their packaging. They had their hands on builds of Win2K long before the boards shipped- they knew exactly what they were up against using the NT 5.0 build and trying to use AFR via multiple rasterizers over AGP.

NVIDIA's blatant cheating with 3DMark03 (hardcoding their drivers around the exact benchmarks to boost their scores), and their highly questionable marketing on the PVP chip (revising the feature set six months after release), for instance?

Quack, ZD's old 3DBench too- I never went off on ATi back then about those (except to chuckle at how horrible the IQ was in Q3 while everyone claimed how great it looked) and you can feel free to check the archives to verify that (I can't edit them now). ATi was cheating before nV had a viable PC part (detecting single buffering and dropping frames completely from being rendered) and they cheated with Quake3- those I consider very nasty marketing issues. For that matter, I currently think of any AF bench from either ATi or nVidia to be 'cheating' as they are both using considerable shortcuts which negatively affect IQ instead of rendering it properly- but again that is a marketing type thing. Nothing like promising a high end part will function with an OS you know for a fact won't.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: BenSkywalker
Well, I guess the thousands of perfectly functioning R9800Pro cards out there are just flukes?

Define perfectly functioning first.

Um, working the way it is supposed to (sort of like mine does). If there was a widespread problem with 9800Pro cooling, don't you think it would have maybe shown up in more than a handful of cases? You'd think that the boards here would have been inundated with posts from people with defective cards.

I heard about the incredible perfectly functioning Rage128 parts, the perfectly functioning R100 parts, the perfectly functioning R200 parts and the perfectly functioning at launch R300 parts all of which were laughable at best.

I never personally used an R100 or R200, but I knew several people with R95XX/R97XX cards that didn't have any problems with them. The only major thing I recall is that some early cards were not happy on certain motherboards with AGP8X and/or Fast Writes enabled. Annoying? Yes. Ruinous? Hardly.

You want to see something broken with a R9800Pro fire up Sacrifice using any driver revision. The game will not render properly. ATi has known about it for years too.

How long are you going to harp on this one game? That's how many years old?

And have you considered that maybe the bug is not their fault? Or not easily fixable without changes to the game? Has the developer said anything about the problem?

Considering that Win2K wasn't out when they released the card originally, I find it difficult to blame ATI for this one.

They never dropped their claim for Win2K support from their packaging.

I'm sure they had every intention of supporting it in the first place (although I do think they should have announced sooner that they would not be supporting it -- they didn't finally decide that until three months after Win2K was out). Did you want them to recall all the old boards so they could change the boxes or something?

They had their hands on builds of Win2K long before the boards shipped- they knew exactly what they were up against using the NT 5.0 build and trying to use AFR via multipler rasterizers via AGP.

The RAGE MAXX launched in December of '99, meaning they probably had the hardware more or less set in stone 6 months before that (and the basic design was probably set more like a year in advance). Win2K didn't launch until February of 2000. At best they probably had an early alpha or beta build when they were doing hardware development of the boards. You can't blame ATI entirely for Microsoft's boneheaded rendering API -- the cards were built for Win98, and then Win2K changed the rules.

NVIDIA's blatant cheating with 3DMark03 (hardcoding their drivers around the exact benchmarks to boost their scores), and their highly questionable marketing on the PVP chip (revising the feature set six months after release), for instance?

Quack

It's hardly "cheating" when the next driver revision fixes the graphical anomalies and still has the performance improvement.

For that matter, I currently think of any AF bench from either ATi or nVidia to be 'cheating' as they are both using considerable shortcuts which negatively affect IQ instead of rendering it properly- but again that is a marketing type thing.

I suppose MSAA is "cheating" too? :disgust:

Nothing like promising a high end part will function with an OS you know for a fact won't.

I don't recall them aggressively marketing the board as a Windows 2000 part (especially considering that Windows 2000 wasn't out yet when it launched). I don't recall a whole lot of gamers jumping all over Win2K right around its release in any case.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
People still think Quack was a cheat? Hmm.

Most of us think "Quack" exposed the cheat:
http://www.hardocp.com/article.html?art=MTEx
There are some obvious differences in the quality of the screenshots if you care to examine them closely.

The Facts As We See Them:

It certainly seems to us here at HardOCP that ATi has in fact included application specific instructions in their version 5.13.01.3276 Win2K drivers that make Quake 3 arena benchmarks faster by up to over 15%.

Whether they later came up with faster drivers with better image quality is fairly irrelevant; that's always the case. The fact is they gave reviewers drivers with application-specific "optimizations" that significantly reduced image quality in an effort to make the reviews of the card more favorable than they would have been.

If they felt the results were attainable without image degradation, they should have waited till this was possible, or noted it to the reviewers.

Yep. They cheated to sell cards and fixed it after the fact.
 

Ackmed

Diamond Member
Oct 1, 2003
8,487
532
126
Originally posted by: Rollo
Originally posted by: Ackmed
People still think Quack was a cheat? Hmm.

Most of us think "Quack" exposed the cheat:
http://www.hardocp.com/article.html?art=MTEx
There are some obvious differences in the quality of the screenshots if you care to examine them closely.

The Facts As We See Them:

It certainly seems to us here at HardOCP that ATi has in fact included application specific instructions in their version 5.13.01.3276 Win2K drivers that make Quake 3 arena benchmarks faster by up to over 15%.

Whether they later came up with faster drivers with better image quality is fairly irrelevant; that's always the case. The fact is they gave reviewers drivers with application-specific "optimizations" that significantly reduced image quality in an effort to make the reviews of the card more favorable than they would have been.

If they felt the results were attainable without image degradation, they should have waited till this was possible, or noted it to the reviewers.

Yep. They cheated to sell cards and fixed it after the fact.


The fact is, we don't know what the truth is. Different perceptions of the information we have doesn't mean we know the truth 100%. Who says they knew the problem was there? The drivers they gave reviewers were the same ones that shipped.

I too was upset and thought that ATi had cheated when all the news hit the net. I had just downgraded from a GeForce3 to an 8500 128MB because of the GF3's horrible dual-display support. I don't know if it was drivers or hardware, but it sucked compared to the 8500, so I switched. So I wasn't happy with this news.

Once the dust had settled and new drivers came out, it's hard to call it a cheat. The biggest fact that supports the bug theory is that within a few weeks, ATi released a driver that fixed the image quality problems, yet the new drivers were even faster. The drivers with the problem were the first ones out, the ones that actually shipped with the card on the CDs. They had just made the change from the R100 to the R200, and it's well known the 8500 series drivers sucked for the first few months, as they had many bugs. Some people like to remind everyone that the drivers sucked for the 8500, yet they can't believe that "Quack" was a bug? Why is that?

 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Whether they later came up with faster drivers with better image quality is fairly irrelevant; that's always the case. The fact is they gave reviewers drivers with application-specific "optimizations" that significantly reduced image quality in an effort to make the reviews of the card more favorable than they would have been.

If they felt the results were attainable without image degradation, they should have waited till this was possible, or noted it to the reviewers.

Yep. They cheated to sell cards and fixed it after the fact.

The next driver revision fixed the graphical anomalies but kept the performance benefits. ATI said it was a bug, and given the result with the next set of drivers, I'd be inclined to believe them on this one. It also affected normal gameplay as well as benchmarking, further making it seem like an unintentional side-effect of the optimizations. Finally, these weren't special drivers they gave to the reviewers, but the shipping drivers that went out with the first wave of cards.

If you want to call all application-specific optimization "cheating", then that's fine, but all videocard companies have application-specific optimizations, and so they all "cheat" under that definition.
 

Avalon

Diamond Member
Jul 16, 2001
7,567
156
106
Humey, you DO realize that by comparing a 430mhz clock speed on the G70 to a 425mhz clock speed on a 6800UE, you're completely forgetting the fact that video cards don't get their fillrate from clockspeed alone? The G70 will debut with more pixel pipelines, which will let it push more data through than a 6800UE, despite a "meager" 5mhz clockspeed advantage.
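Avalon's point is easy to put in numbers. Peak pixel fillrate is roughly core clock times pixel pipelines, so a small clock bump plus more pipes swamps a clock-only comparison. A rough sketch, using the commonly reported pipe counts (16 for the 6800 Ultra, 24 for the rumored G70 -- treat those figures as assumptions):

```python
# Rough fillrate comparison: clock speed alone is a misleading metric.
# Pipe counts below are the commonly reported figures, not confirmed specs.

def fillrate_mpixels(core_mhz, pixel_pipes):
    """Peak pixel fillrate in megapixels/s = core clock (MHz) * pipelines."""
    return core_mhz * pixel_pipes

geforce_6800u = fillrate_mpixels(425, 16)
rumored_g70   = fillrate_mpixels(430, 24)

print(geforce_6800u)                 # 6800 MP/s
print(rumored_g70)                   # 10320 MP/s
print(rumored_g70 / geforce_6800u)   # ~1.52x from a "meager" 5 MHz clock bump
```

So even at nearly identical clocks, the wider part pushes roughly half again as many pixels per second.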
 

krimsal

Senior member
May 12, 2005
246
0
0
Check and mate, Avalon. Clock speed isn't everything. "Don't tell me the ending without seeing the movie."
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
If the information is even correct to begin with.

Since there's still no official word from nVidia directly, it's all speculation (and personally, I think the power ratings AND the burned card is a load of bunk, but that's just me).
 

Emultra

Golden Member
Jul 6, 2002
1,166
0
0
Maybe there should be WindowsXP nVidia Edition and WindowsXP ATi Edition, each specialized for optimal performance.
 

piddlefoot

Senior member
May 11, 2005
226
0
0
As I understand it, nVidia will be ahead in tech terms by a smidge: they have a smaller die than ATi, which usually means less voltage, better production, less heat, and lower clock speeds.

Intel thought this was a rule too, until their last die shrink, where the CPUs started to generate more heat and they had to lower the core and lose performance. But ATi vs nVidia is like the Intel vs AMD argument: no, clock speed isn't everything; it's the architecture of the chip that makes the difference, and the same goes for video cards to a point. ATi have had to catch up, so to speak, and have done well; nVidia have produced some groundbreaking tech in video cards, and they seem to run cooler and have fewer driver issues (not OCing), whereas with ATi you'd better keep it cool or it'll melt (and I melted an X800 Pro: fried a RAM module and left heat damage on the GPU plate). But it will never stop: nVidia vs ATi, AMD vs Intel. Gamers can use either, and for the record each PC system will always have a preference, but to find out, YOU need to test it on YOUR system, as there are so many things that affect performance. A top-of-the-range ATi will be the best you can get, and a top-of-the-range nVidia will be the best you can get; working out which one YOUR system will like best... well, have fun.
 

ssvegeta1010

Platinum Member
Nov 13, 2004
2,192
0
0
Originally posted by: Avalon
Humey, you DO realize that by comparing a 430mhz clock speed on the G70 to a 425mhz clock speed on a 6800UE, you're completely forgetting the fact that video cards don't get their fillrate from clockspeed alone? The G70 will debut with more pixel pipelines, which will let it push more data through than a 6800UE, despite a "meager" 5mhz clockspeed advantage.

Exactly.
Is the 6600GT faster because it runs at 500mhz clock?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
According to ATi the card is just fine- it is operating within specs.
ATi told you the crashing was normal?

It has done this since it was new
It might've been wise to RMA it at that point then, don't you think?

I was the stupid one thinking that they sold a reliable product
Again, if every 9800 Pro shipped with a faulty cooler everyone would be having problems with it. Why is it so hard to believe you may have a faulty card?

I have been informed by ATi that they offer no warranty that any game will ever work on their hardware either
Uh, no hardware vendor ever guarantees software. For that matter 99% of software vendors don't either and it's one of the first exclusions they include in the EULA.

although that took an awful lot of ripping them about Sacrifice before they would say anything at all.
Just out of interest have you tried that game on new nVidia hardware (say NV4x)? I think we had this discussion before and the problem pointed to a lack of a W-buffer which I'm not so certain new nVidia cards support either.

As for driver bugs, you don't want to go down that road. Not after the stories I could give you about the 6800U.

You want to see something broken with a R9800Pro fire up Sacrifice using any driver revision.
I'm sorry, how did we make the leap from rendering glitches being a hardware fault?

Is nVidia's DEP issue that lasted 6+ months and caused spontaneous system reboots a hardware problem too?
Are the random system reboots in games such as Halo, Chaser and JK2 proof of hardware defects?
Are single digit framerates in COD and multi-second pausing in SOF2 with 4xAA enabled also evidence of such?

I guess nVidia is far from "perfectly functioning", huh?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Most of us think "Quack" exposed the cheat:
Yeah? And what do "most of us" think about nVidia's exposed 3DMark and UT2003 cheats? Or how about nVidia's massive shader replacement in Doom 3, for which we have Carmack's testimony?

You don't need to answer that because I remember your dismissive antics quite clearly when the evidence was being discussed.

Whether they later came up with faster drivers with better image quality is fairly irrelevant; that's always the case.
No, it isn't always the case.

The fact is the gave reviewers drivers with application specific "optimizations" that significantly reduced image quality in an effort to make the reviews of the card more favorable than they would have been.
You mean like nVidia? Funny I don't ever recall you harping on about that, probably because you were too busy "kicking ass online at 1024x768" with your silent 5800, the one you had purchased for the third time while lambasting ATi that you refuse to buy the same tech repeatedly.

If they felt the results were attainable without image degradation, they should have waited till this was possible, or noted it to the reviewers.
Except they didn't know they had a bug.

In case somebody's memory has slipped, here is a refresher of the cheats FutureMark found. Even if we assume Quack is a cheat (which it isn't) it's nothing more than a drop in the bucket compared to what we've seen from nVidia.

What Are The Identified Cheats?
Futuremark's audit revealed cheats in NVIDIA Detonator FX 44.03 and 43.51 WHQL drivers. Earlier GeForceFX drivers include only some of the cheats listed below.

1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the workload. However, if the loading screen is rendered in a different manner, the driver seems to fail to detect 3DMark03, and performs the back buffer clear command as instructed.

2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case the driver uses instructions contained in the driver to determine when to obey the back buffer clear command and when not to. If the back buffer would not be cleared at all in game test 2, the stars in the view of outer space in some cameras would appear smeared as have been reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so that the back buffer is cleared only when the default benchmark cameras show outer space. In free camera mode one can keep the camera outside the spaceship through the entire test, and see how the sky smearing is turned on and off.

3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds two static clipping planes to reduce the workload. The clipping planes are placed so that the sky is cut out just beyond what is visible in the default camera angles. Again, using the free camera one can look at the sky to see it abruptly cut off. Screenshot of this view was also reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51 drivers as far as we know.

4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this detection to artificially achieve a large performance boost - more than doubling the early frame rate on some systems. In our inspection we noticed a difference in the rendering when compared either to the DirectX reference rasterizer or to those of other hardware. It appears the water shader is being totally discarded and replaced with an alternative more efficient shader implemented in the drivers themselves. The drivers produce a similar looking rendering, but not an identical one.

5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the shader is being totally discarded and replaced with an alternative more efficient shader in a similar fashion to the water pixel shader above. The rendering looks similar, but it is not identical.

6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection proved to reduce the frame rate with these drivers, but we have not yet determined the cause.

7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection drops the scores with these drivers. This cheat causes the back buffer clearing to be disregarded; we are not yet aware of any other cheats.

8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the driver. When we prevented this detection, the performance dropped by more than a factor of two in the 2.0 pixel shader test.
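The common thread in Futuremark's list (items 2, 4, 5, 7, 8) is detection-and-replacement: the driver fingerprints an incoming shader and silently swaps in a cheaper hand-written substitute. A minimal sketch of that technique; everything here is illustrative, not actual driver code:

```python
# Hypothetical sketch of shader detection-and-replacement as described in
# the Futuremark audit: hash the submitted shader source, and if it matches
# a known benchmark shader, substitute a cheaper hand-tuned version.
# All names and hashes are made up for illustration.
import hashlib

# Fingerprint -> cheaper replacement (contents are placeholders)
KNOWN_SHADERS = {
    hashlib.md5(b"M_Water.psh source text").hexdigest(): "cheap_water_approximation",
}

def compile_shader(source: bytes) -> str:
    """Return a canned replacement if the shader is recognized; otherwise
    compile it normally."""
    fingerprint = hashlib.md5(source).hexdigest()
    if fingerprint in KNOWN_SHADERS:
        return KNOWN_SHADERS[fingerprint]   # benchmark detected: swap shader
    return "honestly_compiled_shader"

print(compile_shader(b"M_Water.psh source text"))   # cheap_water_approximation
print(compile_shader(b"some other game's shader"))  # honestly_compiled_shader
```

This also explains Futuremark's countermeasure: trivially altering the shader text changes the fingerprint, the lookup misses, and the performance "optimization" disappears.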
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
You mean like nVidia? Funny I don't ever recall you harping on about that, probably because you were too busy "kicking ass online at 1024x768" with your silent 5800, the one you had purchased for the third time while lambasting ATi that you refuse to buy the same tech repeatedly.

LOL

BFG, I came this close to buying a 4th 5800Ultra a couple nights ago. I saw some for sale on eBay and the compulsion to buy another was almost overwhelming.

Say what you will about the Dustbuster, but no other card has fascinated me to the extent that one did.

Have I ever wanted another 9700Pro? LOL- yeah right.

BTW-
Except they didn't know they had a bug.
I guess we'll never know for sure because neither of us works for ATI. You choose to believe their damage control, I don't. I notice you didn't bother to list ATI's 3DMark cheats either?
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Rollo
LOL

BFG, I came l l this close to buying a 4th 5800Ultra a couple nights ago. I saw some for sale on Ebay and the compulsion to buy another was almost overwhelming.

Say what you will about the Dustbuster, but no other card has fascinated me to the extent that one did.


Personally, I predict the 5800 Ultra becoming as sought-after a collector's item as the unreleased Voodoo 5 and 6 series cards are now.

That card is as infamous as they get - the people who have a shelf of framed PCBs will be PINING for one in a year or two - it's already starting.


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ronin
Guess I'm glad I've got 4 or 5 already then.

Bastige!
I've had 3 (although the rare and beautiful Abit 5800 OTES arrived DOA and I wouldn't do a fraudulent RMA through the seller) but sold them as they were on a shelf.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Um, working the way it is supposed to (sort of like mine does). If there was a widespread problem with 9800Pro cooling, don't you think it would have maybe shown up in more than a handful of cases?

Check out back when the R9800Pro was new and the huge amount of threads of people replacing the stock cooler because it was inadequate.

I never personally used an R100 or R200, but I knew several people with R95XX/R97XX cards that didn't have any problems with them.

I ran a R9500Pro right after launch and roughly half my games did not work due to KNOWN bugs. I had to swap drivers depending on what game I wanted to play.

How long are you going to harp on this one game? That's how many years old?

First off, the game is newer than Quake3- last I checked it was also still in the top 25 all-time games. WTF should it matter anyway- why can't it work?

And have you considered that maybe the bug is not their fault?

It is their fault, and they know that. They would have to remove some.... 'optimizations' in order to get it to work properly.

Did you want them to recall all the old boards so they could change the boxes or something?

You ship out a sticker to place on the merchandise as a modification to the specifications or you do a full recall. I work in distribution- this is done regularly by every remotely honest company.

You can't blame ATI entirely for Microsoft's boneheaded rendering API -- the cards were built for Win98, and then Win2K changed the rules.

Win2K is built on the NT platform, which had the exact limitation that Win2K does, period. NT was out quite some time prior to Win2K, and this doesn't have anything to do with the API- who the he!l filled your head with that nonsensical drivel?

It's hardly "cheating" when the next driver revision fixes the graphical anomalies and still has the performance improvement.

They were using application detection and building in a shortcut for that particular game- exactly what nVidia did. Apologize away the earlier example. Sorry- I honestly don't think any of them were that big of a deal(and the archives will back me up) just that the lunatic fringe fanatics have an enormous double standard.

I suppose MSAA is "cheating" too?

As a form of AA no- as FSAA absolutely. FSAA is full scene anti aliasing, MSAA isn't remotely close to that.
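The cost difference behind that distinction is easy to quantify: true full-scene (super-sampled) AA runs the pixel shader once per sample for every pixel, while MSAA shades once per pixel and only stores extra coverage/depth samples for edge resolve. A sketch of the shading workload under that simplified model:

```python
# Simplified model of shading cost: supersampling (SSAA) shades every
# sample; multisampling (MSAA) shades once per pixel and multisamples
# only coverage/depth, which is why it's so much cheaper.
def shader_invocations(width, height, samples, technique):
    """Approximate pixel-shader invocations for one full-screen pass."""
    pixels = width * height
    if technique == "ssaa":
        return pixels * samples   # every sample fully shaded
    if technique == "msaa":
        return pixels             # one shading pass; extra samples at edges only
    raise ValueError(f"unknown technique: {technique}")

print(shader_invocations(1024, 768, 4, "ssaa"))  # 3145728
print(shader_invocations(1024, 768, 4, "msaa"))  # 786432
```

At 4x on a 1024x768 frame, supersampling does four times the shading work, which is exactly the work MSAA skips (and why interior texture aliasing is untouched by it).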

I don't recall them aggressively marketing the board as a Windows 2000 part (especially considering that Windows 2000 wasn't out yet when it launched). I don't recall a whole lot of gamers jumping all over Win2K right around its release in any case

Very clearly you weren't here.

BFG-

ATi told you the crashing was normal?

They tried telling me I didn't have enough juice in my PSU(420Watt), then they tried to tell me I didn't have a mobo that supported AGP 8x(nForce2) and then they told me I didn't have enough cooling in my case(ELEVEN fans).

It might've been wise to RMA it at that point then, don't you think?

I got it working- and the only thing a RMA would have gotten me was another R9800Pro- and then they were regularly shipping 128bit mem bus replacement parts for defective boards. Trust me, I checked in to everything very thoroughly.

Again, if every 9800 Pro shipped with a faulty cooler everyone would be having problems with it.

There were two different R9800Pro cores- I got the bad one.

Why is it so hard to believe you may have a faulty card?

ATi employees and enthusiasts all telling me it was normal operation. I posted about all these issues back when I first got the board.

Uh, no hardware vendor ever guarantees software.

And if not a single game works on the board ATi will refuse to give you a refund- keep that in mind.

Just out of interest have you tried that game on new nVidia hardware (say NV4x)?

Yes, works just fine on new nV hardware.

As for driver bugs, you don't want to go down that road. Not after the stories I could give you about the 6800U.

You tell me a single bug ever that lasted over a year on a nV part- go back to the NV1 if you would like. That's just a year, but it really isn't fair to compare the little league driver team of ATi to nVidia.

Except they didn't know they had a bug.

Your hypocrisy is utterly disgusting. You went off like a rabid animal about any sort of application detection being cheating for how long and now you are a full blown ATi apologist. Your lack of logic is dumbfounding on this subject.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: DaveA
Originally posted by: humey
That's the reason the Japanese market has 900-watt PSUs available, but a 700-watt is a bit easier to get today. We can get a 680-watt Thermaltake PurePower; not sure if it is enough to SLI two of them cards, and I'm 90% sure that's not the flagship. You wait till ATI launch- nVidia always have an Ultra. There is no way the flagship is a 430-core GPU.

dumb ass post of the year. 6800 ultra is only 425mhz core.

the 6800 isnt a G70.
 