Worst GPUs of All Time?

Page 5 - AnandTech Forums

Thinker_145

Senior member
Apr 19, 2016
609
58
91
lol at people calling single GPU kings as the worst cards ever wtf?

Anyways I think it's the 2900XT, because of the damage it did to ATI; they never really recovered from it. The new generation of consoles meant DX9 cards were all crippled, and the 8800 became the de facto PC gaming card.

The 4870 came too late to the party; Nvidia had already established themselves as the GPU brand of high-end gamers, and it remains so today. They did manage to win back mid-range gamers, but failing to win over the high-end gamers has cost them dearly.

The 5870 was probably their best moment, but Nvidia had built such confidence among gamers that enthusiasts waited for them to reply, and reply they did.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,070
7,492
136
For the unified shading era, it's easily the 2900xt.

Aside from all the technical limitations stated above, the biggest impact of this card was cementing, for both AMD/ATI and consumers, that this was a value brand that played second fiddle to NV.

I think this card really scarred AMD and directly led to the "sweet spot" strategy that ceded the top end to NV for the next seven card generations.

It wasn't until the Fury X that AMD tried to compete for maximum performance again.

Hopefully AMD is committed to recapturing the performance crown despite the setbacks the Fury faced against the 980 Ti.
 

HiroThreading

Member
Apr 25, 2016
173
29
91
lol at people calling single GPU kings as the worst cards ever wtf?

Anyways I think it's the 2900XT, because of the damage it did to ATI; they never really recovered from it. The new generation of consoles meant DX9 cards were all crippled, and the 8800 became the de facto PC gaming card.

The 4870 came too late to the party; Nvidia had already established themselves as the GPU brand of high-end gamers, and it remains so today. They did manage to win back mid-range gamers, but failing to win over the high-end gamers has cost them dearly.

Dual GPUs are known to be flaky, so it's no surprise when they run into driver issues.

I think the AMD buyout damaged ATI's image more than R600 did. AMD just has a trash image amongst enthusiasts and PC gamers, and I have no idea why AMD retired the ATI brand name so quickly after the buyout.

Do you think Lamborghini or Audi would have as much of a following, or command such high price premiums, if they were renamed Volkswagen and had VW logos stamped all over them?
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Dual GPUs are known to be flaky, so it's no surprise when they run into driver issues.

I think the AMD buyout damaged ATI's image more than R600 did. AMD just has a trash image amongst enthusiasts and PC gamers, and I have no idea why AMD retired the ATI brand name so quickly after the buyout.

Do you think Lamborghini or Audi would have as much of a following, or command such high price premiums, if they were renamed Volkswagen and had VW logos stamped all over them?
I am not talking about dual GPUs, but people are mentioning the GTX 280 and 480 among the worst when both secured the top single-GPU spot at launch, which is an invaluable achievement for any card. Even thinking of mentioning them as some sort of worst card is ridiculous.

And yes, 100% agreed about the branding. Absolutely atrocious move.
 

tajoh111

Senior member
Mar 28, 2005
305
323
136
The issue is that R600, besides the ridiculous 512 bit memory bus, was actually well engineered. As I said before, most of the issues of R600 were due to TSMC's disastrous 80nm process, causing ATI engineers to downclock the card by about 200MHz in order to keep power consumption in check. Oh, and I'll give you the fact that they completely f'd up the MSAA hardware -- even RV670 didn't address that issue iirc.

I'm not sure I have such a rosy memory of Fermi as you do. I think you're mostly referring to the GTX 580? That was very much a decently polished version of Fermi. Power consumption was a lot better, thermals were improved and performance was very good.

Also, Fermi only did well in CUDA apps. In raw compute and double-precision FP, Cypress, Cayman and Tahiti were much better iirc.

And yeah, Tonga was just bizarre. It got them the Mac sales due to support for better IQ, but AMD marketing completely botched the R9 285 launch.

Did you click the review I posted? Fermi had insanely high IPC, higher than chips we have today. All chips today have sacrificed some IPC for raw flops. But Fermi was a super-high-IPC design that made each flop count, and this is reflected in benchmarks. It only had 448 shaders, but it stomped the V8800, which is based on the 5870 GPU. AMD's high flop counts were a result of the cores being simple and small, which meant you could add a lot of raw flops with very little die area, but they were not very efficient. They were simpler shaders, and utilization in most scenarios would be low. As you scaled up with VLIW5, utilization went down. Even AMD recognized this, and as a result made VLIW4, which had higher IPC but basically the same flop rating (actually a tad less).

It's why Nvidia was competing with and beating AMD with less than half the raw flops, even in games. The GTX 480 was a 1.345 TFLOP card while the 5870 was a 2.72 TFLOP card, but the former is the faster card. This was even more pronounced in professional applications. And the GTX 480 was a heavily cut-down chip running at low clocks. The Quadro 6000 was a compute card first and a gaming card second. It didn't have the highest flops, but it made every flop count, which is more important, particularly when you're beating the competition by 100% quite often.
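The flop figures in the post can be reproduced from published shader counts and clocks: peak single-precision FLOPS is shaders × 2 ops (one multiply-add) per clock × clock rate. A rough sketch, with the spec numbers taken from commonly published spec sheets rather than from the post itself:

```python
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical peak single-precision TFLOPS, assuming one
    fused multiply-add (2 ops) per shader per clock."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# GTX 480: 480 CUDA cores at the 1401 MHz shader ("hot") clock
gtx480 = peak_tflops(480, 1401)
# HD 5870: 1600 VLIW5 stream processors at 850 MHz
hd5870 = peak_tflops(1600, 850)

print(f"GTX 480: {gtx480:.3f} TFLOPS")  # 1.345, as quoted above
print(f"HD 5870: {hd5870:.2f} TFLOPS")  # 2.72
print(f"flops ratio: {hd5870 / gtx480:.2f}x")  # ~2.02x
```

The arithmetic supports the post's point: the 5870 had roughly twice the theoretical flops, yet per-flop utilization on VLIW5 was low enough that the GTX 480 still won in games.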

Most of those apps aren't CUDA ones; they're simply commonly used professional apps or programs used to measure performance.

http://hothardware.com/reviews/amds-new-firepro-w8000-w9000-challenge-nvidias-quadro?page=4

It wasn't until Tahiti that AMD focused on IPC and could actually do anything with their flops. The cores became larger, IPC went up, and AMD could compete. But we didn't get a huge increase in cores.

Generally, the larger the core, the better it is at professional applications.

And no, I was talking about the GTX 480.

https://www.techpowerup.com/reviews/AMD/HD_7970/28.html

The GTX 580 was like 38% faster than a 5870, not the 20-21% faster I mentioned for the GTX 480. It captured the performance crown, and its lead grew over time. AMD's second-gen 6970 is basically just the same speed as it. The 2900XT fell on its face, was a distant second to Nvidia's G80, and used more power to boot. Much like the FX 5800. The GTX 480, although using more power, did get the performance crown, and in doing so cemented Nvidia's lead and reputation in the professional market. Something the 2900XT never remotely accomplished. In fact, it almost led AMD's GPU department to financial ruin.

Comparing the GTX 480 and the 2900XT doesn't make sense. The 2900XT was badly engineered from the beginning. The problem with the chip is that AMD didn't realize the IPC of VLIW5 was so low that the card was super underpowered, even with higher flops. What it needed was a smaller, simpler memory controller and more shaders. Even with higher clocks it would have sucked, because the card would be drinking juice like no tomorrow. What AMD did with the 4870 is throw out the original organization of the 2900XT and reconfigure it to be shader-heavy, to account for VLIW5's lower IPC. That wasn't minor polishing like between the GTX 480 and 580. There was a major reorganization of the VLIW designs after the 2900XT; hence the engineering of the 2900XT was no good from the beginning.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,227
153
106
The Intel 740 story is pretty interesting. It was very close to being great, but not quite.

https://en.wikipedia.org/wiki/Intel740

There was only ONE version of the i740 that was good: the Real3D StarFighter. I got mine as a free review sample, and it was the only one that had OpenGL support. I tried a couple of other i740 cards, and they would only offer 3D game support in D3D.

There were some games like Wing Commander Prophecy and early Unreal that were 3DFX or bust. My Canopus Pure3D (the only one with the extra video RAM!) came in handy there!


...anyone remember the unreleased Voodoo 5 6000 card with four GPUs and fans on it? Nuts!
 

Blue_Max

Diamond Member
Jul 7, 2011
4,227
153
106
S3 Virge graphics decelerator. One of these came with my family's home computer in the mid 90s, I don't think there was one game that I could get it to work with.

I can think of only ONE game: Descent. And using the Virge in 3D mode increased the graphics quality notably - but it ran even SLOWER than software mode!
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
There were some games like Wing Commander Prophecy and early Unreal that were 3DFX or bust.

Yeah, that's what made me love my Voodoo 2. Glide was Mantle's granddaddy. Not only did almost every popular game have support, but I was big into emulators back then (still am, actually), and the first decent N64 emulators used Glide. It was a fun era.
 

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
FX 5800 Ultra. It holds the crown for the biggest and loudest clown. Such a sillyass GPU it was. "Oh, buy it now for its amazing cinematic computing revelation". It was slower than the already existing 9700 Pro and sounded as loud as the creation of a universe taking place in my bedroom. It doesn't get any louder than that. That's the loudest thing. A universe being born in your room.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
The fx5800 was arguably more competitive in performance because it could take the lead in a few benchmarks. The 2900xt couldn't.

No, it wasn't competitive for anything that mattered - the FX5800 was an utter disaster for anything which required DX9.

At that time Nvidia had been generally ahead of ATI for a while with the Geforce 3 and Geforce 4 series, and the FX came as an utter shock when it was released. The 9700 PRO came out of the blue - nobody expected ATI to launch such a card. ATI by that point had been written off as not being able to beat Nvidia at the high end.

The 9700 PRO was the card that actually proved ATI could compete in the high-end market, and it changed people's perception of what they could do.

The non-standard DX9 implementation in the FX series meant it could not even run HL2 properly - Valve not only had to make an alternate DX9 path for the FX5800 but had to degrade image quality too. You underestimate how big a title HL2 was at the time; even Valve basically told people to get an ATI card for the game.

It was a title people had waited six years for, and the premiere graphics card company at the time, Nvidia, could not make a card which could run it properly in DX9.




That is from a Valve presentation at the time - they basically could not recommend using the FX series for HL2.

This was the first time many people (including myself) actually bought an ATI card, and friends who bought an FX card were so disappointed they ended up buying an ATI card the next time.

Plus the backlash was so bad for the FX series that the follow-up X800 series was apparently the only time ATI managed to gain more market share than Nvidia, even though the Nvidia 6000 series was technically better. Not even AMD could do that with the HD5870 launching six months before the GTX480 and the HD5850 launching nine months before the GTX460.

Nvidia even mocked the FX5800 with the following video:

https://www.youtube.com/watch?v=H-BUvTomA7M

The 2900XT was comparatively less of a disappointment, though it was still pretty crap - ATI never had the kind of two-generation run of superiority before the FX that Nvidia had, as Nvidia had literally dominated ATI at the high end before the FX. The FX was a shock to the system; the 2900XT was more an utter disappointment, but so was the X1800 series too. Even though the X800 series outsold the 6800 series, the fact is the latter was technically superior, and it was only really the X1900 series which had an edge in reality. So of the three series between the FX and the G80, Nvidia had the technical edge in two.

Nobody had written off Nvidia the way ATI had been written off just before the 9700 PRO was launched.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The issue is that R600, besides the ridiculous 512 bit memory bus, was actually well engineered. As I said before, most of the issues of R600 were due to TSMC's disastrous 80nm process, causing ATI engineers to downclock the card by about 200MHz in order to keep power consumption in check. Oh, and I'll give you the fact that they completely f'd up the MSAA hardware -- even RV670 didn't address that issue iirc.

R600 was a disaster imo. Calling it well engineered and blaming the 80nm process as its primary fault doesn't, I think, explain the whole picture. The card packed huge memory bandwidth for its time but lacked texture/pixel fillrate due to the lack of TMUs/ROPs. Unless it was clocked considerably higher to offset the low number of units, the G80-based cards would still have the upper hand. To top it all off, the stock cooler on the 2900XT was awful.

They even admitted that they were only aiming for 30% extra performance over the X1950XTX, iirc, per B3D's David Baumann. It was late, and nVIDIA did too much damage during this time, especially when the 8800GT first appeared.

The 6800 and 6600GT series from nVIDIA, I think, are what really captured the minds of gamers and essentially became the foundation of the mindshare today, imo. ATi never really had a proper answer other than cards like the X800XL. The 6600GT, 7600GT, 7800GT, 8800GT etc. were the best bang-per-buck cards of their heyday.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
R600 was a disaster imo. Calling it well engineered and blaming the 80nm process as its primary fault doesn't, I think, explain the whole picture. The card packed huge memory bandwidth for its time but lacked texture/pixel fillrate due to the lack of TMUs/ROPs. Unless it was clocked considerably higher to offset the low number of units, the G80-based cards would still have the upper hand. To top it all off, the stock cooler on the 2900XT was awful.

They even admitted that they were only aiming for 30% extra performance over the X1950XTX, iirc, per B3D's David Baumann. It was late, and nVIDIA did too much damage during this time, especially when the 8800GT first appeared.

The 6800 and 6600GT series from nVIDIA, I think, are what really captured the minds of gamers and essentially became the foundation of the mindshare today, imo. ATi never really had a proper answer other than cards like the X800XL. The 6600GT, 7600GT, 7800GT, 8800GT etc. were the best bang-per-buck cards of their heyday.

ATI actually outsold the 6000 series with the X800 series:

http://cdn.wccftech.com/wp-content/uploads/2015/04/Nvidia-AMD.png

It was even more shocking as the 6000 series was technically superior. It shows you how much of a disaster the FX was: it took until the rather delayed X1800 series and the Geforce 7 series for Nvidia to start having more sales than ATI again.

So let me repeat that: it took Nvidia having better cards for two generations, plus ATI delaying their first DX9.0c-capable card, for Nvidia to start selling more cards than them again.

That is the effect the FX had on actual sales - it took nearly two generations and a screwup from ATI to help them.

ATI didn't have technically superior cards for the three generations after the G80 until the HD5000 series launched, and that was because Fermi got very delayed and they were competing with the GTX200 and G92 series cards for quite a while.
 

bbhaag

Diamond Member
Jul 2, 2011
6,762
2,146
146
I can't really think of any, but I was never a big PC gamer, so I was always pretty happy with my GPUs.

On a side note, I don't think some of you guys are giving the S3 Trio32/64 a fair shake. It's considered one of the better DOS GPUs around, not just for its speed but also for its compatibility and availability. Now, the V+ and Virge variants are questionable, but the original Trio is a decent card for 2D gaming, which is all it was ever intended to be.
 

kallisX

Member
Sep 29, 2016
45
39
91
The RX 460 is controversial, but it's simply slow, overpriced and doesn't overclock. It's always at the bottom of performance charts even when the RX 480 and RX 470 do very well.
Eh? The 460 easily bested the GTX 750 by over 30% in performance, at about the same price at this point.
Of course it's at the bottom of the charts, when it's the cheapest card worthy of gaming out there!
 
Feb 25, 2011
16,823
1,493
126
lol at people calling single GPU kings as the worst cards ever wtf?
In the cases where a particular brand's single-GPU flagship of a particular generation was substantially behind the competition, failed to improve on (or in a few cases, was inferior to) the product it replaced, or where it simply sucked as a product, I think the scorn is justified.

The initial FX 5800 was FAST, but it was not only loud, it was also the first flagship card that wasn't really available without a dual-slot cooler. Third-party coolers and the 5900 fixed this to some extent.
The 2xxx series Radeons were an overhyped, underperforming mess.
Iirc, more recently, the GTX 5xx series was a bit underwhelming.
The GTX 960 wasn't a compelling upgrade if you had a 660 Ti.

etc.

Although I'm sure there's also plenty of "I can't afford it so it sucks" style sour grapes.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Eh? The 460 easily bested the GTX 750 by over 30% in performance, at about the same price at this point.
Of course it's at the bottom of the charts, when it's the cheapest card worthy of gaming out there!
It's not cheap enough or performant enough for a FinFET card IMO, and sometimes it really eats it. Plus the 2GB model is alarmingly and unnaturally slow. The 480 and 470 are on a completely different level for FinFET improvements, and the 460 doesn't bring anything new to the no-power-connector market either, when the 950 without a power connector already exists.

The RX 470 and 480 at 232mm^2 are replacing Hawaii at around 460mm^2, but the RX 460 is still not managing to replace Pitcairn in the same way, with a 4GB aftermarket 460 barely beating the R7 370 4GB and the 2GB model getting completely thrashed. It's not the VRAM either; a 2GB 950 slots in between the 370 and the 4GB 460 in this example.

https://www.computerbase.de/2016-08/radeon-rx-460-test/3/

There's also its overclocking, or lack thereof. Techpowerup managed to squeeze 2.4% extra out of their card. Not impressive at all. The RX 480 and 470 aren't overly impressive in this area either, but they can be at times, and on average they're far better than that.

Finally, there's price. We've established that the 2GB 460 is super slow, so we can't really rely on getting the desired performance from one. 4GB 460s go for $124. To be fair, that's cheaper than 2GB 950s, but the 4GB 1050 Ti is available for $129 right now, and the 1050 Ti is no less than 35% faster.

Even if we say that's a fluke and they normally go for about $140-$150, we're still getting 35% extra performance for at most 21% more money. Neither card consumes much power, but for what it's worth the 1050 Ti is more efficient, and it has about 8-9% OC headroom.

So what advantage does this card have? It does badly even when other Polaris cards do well, it's absolutely smashed by its competitors on performance, it's a dubious value proposition, it doesn't overclock, etc. I just can't make a case for purchasing one over a 1050 Ti. Much like a 1050 Ti is somewhat undermined by the higher price RX 470, so is the RX 460 undermined by the higher price 1050 Ti.
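The value argument above can be made concrete with a little arithmetic. A sketch using the post's own figures ($124 for a 4GB RX 460, $129 for a 4GB 1050 Ti, ~35% performance gap - all of these are the post's numbers, not independent data):

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance (vs. a baseline card) per dollar."""
    return relative_perf / price_usd

rx460 = perf_per_dollar(1.00, 124)     # baseline: 4GB RX 460 at $124
p1050ti = perf_per_dollar(1.35, 129)   # 1050 Ti: ~35% faster, at $129

print(f"1050 Ti value advantage: {p1050ti / rx460 - 1:.0%}")  # 30%

# Even at the pessimistic $150 street price the post mentions,
# the 1050 Ti still comes out ahead on performance per dollar:
p1050ti_150 = perf_per_dollar(1.35, 150)
print(f"at $150: {p1050ti_150 / rx460 - 1:.0%}")  # 12%
```

So on the post's numbers the 1050 Ti delivers roughly 30% more performance per dollar at $129, and still about 12% more even at $150.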
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
There's also its overclocking, or lack thereof. Techpowerup managed to squeeze 2.4% extra out of their card. Not impressive at all.

The card they're using basically has a 2% overclock over a base-model 460, so another 2.4% equals about what any Polaris can OC. Mine does a little better than theirs (about 4%), but OC is always a lottery.

So what advantage does this card have?

Personally, I appreciate that it has excellent open-source drivers in Linux (maybe the best open-source GPU drivers ever), that it can mine crypto at less than 70 watts (in ways the 1050 can't), and that it works with hackintoshes (Pascal doesn't). None of those are mainstream uses, and I probably wouldn't recommend it to a pure Windows gamer, but the card does have some advantages.
 