Most spectacular failure in video card history


dug777

Lifer
Oct 13, 2004
24,778
4
0
meh, x-fire; a) is a chipset not a video card, b) has only just been launched so unless Rollo has a crystal ball it's far too early to tell & c) is actually a whole platform that will extend to future r5xx cards THAT won't be res limited as i understand it...

as i said before, the XGI Volari V8 Duo is teh winnar imo, and if anyone has one i'll gladly buy it, i've always wanted one
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
How was the Ti4600 pants? No-one bar 3dfx had a card that could come close to Nvidia until ATI brought out the 9xxx series (when they got the drivers right). The GF1, 2 & 3 kicked everyone's butt!

As for Crossfire, it is a technology, a belated one, as it has already been done, and the board clocking with the Crossfire chipset has nothing to do with this thread! Although it looks like a great OC chip to have on the M/B. However, ATI is, as always, playing catch-up to Nvidia... duh
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
Originally posted by: CaiNaM
of course i don't know why rage fury maxx isn't included either (and frankly x-fire doesn't belong on this list -- not yet anyway)
Rollo liked the Rage MAXX . . .

that's why
:Q

it was an abomination that couldn't live up to ATI's promise to make it work with Win2K

Teh MAXX was teh roxor.

ATI's Rage Fury chip was WAY behind the GF1 in driver quality and speed. So what does ATI do?
Cobbles two of them together on a board to make a close second-place card where two inferior GPUs render every other frame.
It was a totally cool solution that would have gone over a lot better if they could have worked out those nagging details like syncing the AFR, flashing textures, and Win2K.

I didn't use Win2K, so I only experienced two of its weaknesses. I can tell you this: I'd take flashing textures at times and jumpy fps over looking at 60Hz flicker in front of me like some '70s disco.
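To picture the scheme: below is a toy sketch of alternate-frame rendering, where the frame counter alone decides which GPU draws. All names are invented for illustration - this is not ATI's driver code.

[code]
// Toy model of alternate-frame rendering (AFR): even frames go to GPU 0,
// odd frames to GPU 1. The Gpu type and render() are invented stand-ins.
#include <cstdint>
#include <cstdio>

struct Gpu {
    int id;
    void render(uint64_t frame) {
        std::printf("GPU %d renders frame %llu\n", id, (unsigned long long)frame);
    }
};

int main() {
    Gpu gpus[2] = {{0}, {1}};
    for (uint64_t frame = 0; frame < 6; ++frame) {
        // The frame counter alone picks the GPU.
        gpus[frame % 2].render(frame);
        // A real driver must also pace presentation so frames scan out in
        // order; getting that wrong is the "jumpy fps" complained about above.
    }
    return 0;
}
[/code]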
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dug777
meh, x-fire; a) is a chipset not a video card, b) has only just been launched so unless Rollo has a crystal ball it's far too early to tell & c) is actually a whole platform that will extend to future r5xx cards THAT won't be res limited as i understand it...

Au contraire mon frere.

With SLI you could accurately say the video cards are out of the loop, but Crossfire has the technology bolted on to the side of a Crossfire "Master" card.

So I feel correct listing Crossfire as a video card debacle: the cheesy 16X12-limiting chip resides on a video card.

LOL Crossfire for X800s may as well be a high school science project. I don't think I've ever seen a bigger turd actually brought to market.
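For what it's worth, the 16X12 ceiling appears to fall straight out of the dongle's electronics: as I understand it, the master card picks up the slave's output through a single-link TMDS (DVI) receiver, and single link tops out at a 165 MHz pixel clock. Work the standard timing and you land right at the limit:

[code]
1600x1200@60 (VESA timing, blanking included):
  2160 total px/line x 1250 total lines x 60 Hz = 162 MHz pixel clock
single-link TMDS receiver ceiling:                165 MHz
=> 1600x1200@60 is the biggest standard mode that fits through the dongle
[/code]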
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Rollo
Originally posted by: dug777
meh, x-fire; a) is a chipset not a video card, b) has only just been launched so unless Rollo has a crystal ball it's far too early to tell & c) is actually a whole platform that will extend to future r5xx cards THAT won't be res limited as i understand it...

Au contraire mon frere.

With SLI you could accurately say the video cards are out of the loop, but Crossfire has the technology bolted on to the side of a Crossfire "Master" card.

So I feel correct listing Crossfire as a video card debacle: the cheesy 16X12-limiting chip resides on a video card.

LOL Crossfire for X800s may as well be a high school science project. I don't think I've ever seen a bigger turd actually brought to market.

certainly i fail to see why ATI even bothered with it for the x800 series...to me it looks like the remnants of a prototype tech demo (intended to be fully brought to life a generation later) released just so ATI can say 'hey, we can do it too folks!'...that said the crossfire mobo looks hawt, nvidia just lost their crown in the enthusiast A64 mobo market imo...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: apoppin
Originally posted by: CaiNaM
of course i don't know why rage fury maxx isn't included either (and frankly x-fire doesn't belong on this list -- not yet anyway)
Rollo liked the Rage MAXX . . .

that's why
:Q

it was an abomination that couldn't live up to ATI's promise to make it work with Win2K

Teh MAXX was teh roxor.

ATI's Rage Fury chip was WAY behind the GF1 in driver quality and speed. So what does ATI do?
Cobbles two of them together on a board to make a close second-place card where two inferior GPUs render every other frame.
It was a totally cool solution that would have gone over a lot better if they could have worked out those nagging details like syncing the AFR, flashing textures, and Win2K.

I didn't use Win2K, so I only experienced two of its weaknesses. I can tell you this: I'd take flashing textures at times and jumpy fps over looking at 60Hz flicker in front of me like some '70s disco.

the MAXX was universally reviled - except by you . . . i have to assume there is 'selective memory' occurring.

X-fire won't have flashing textures on LCDs . . . the 60Hz limitation is fixed in future revisions . . .
edit: and you probably DID experience gaming with Win98SE at 60Hz with your MAXX

OTOH the MAXX was never fixed . . . rather it was abandoned by ati, who reneged on its promises to its MAXX customers . . .

Look back almost a year . . . SLI had its share of limitations . . . still does . . . you aren't STILL rebooting your PC when you want to turn it on or off are you?
:roll:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
the MAXX was universally reviled - except by you . . . i have to assume there is 'selective memory' occurring.
Errr, it was a cool card, A for effort, C for performance.

X-fire won't have flashing textures on LCDs . . . the 60Hz limitation is fixed in future revisions . . .
edit: and you probably DID experience gaming with Win98SE at 60Hz with your MAXX
I wasn't aware Win98SE limited refresh rates?

OTOH the MAXX was never fixed . . . rather it was abandoned by ati, who reneged on its promises to its MAXX customers . . .
The 16X12 at 60Hz will never be fixed either? And it's a worse problem than not working in an OS that wasn't exactly "all the rage" for gaming. In the early days of Win2K it wasn't the gamers' choice.

Look back almost a year . . . SLI had its share of limitations . . . still does . . . you aren't STILL rebooting your PC when you want to turn it on or off are you?
:roll:
You've said this a couple times lately, and I have to say: Why would I want to turn off SLI? I haven't "turned it off" all year.

Beyond that, this issue will be addressed in either the 78.03s or the 8-series drivers, I forget which. (mostly because I don't care)

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Wrong again BFG.
I noticed you side-stepped most of the questions, as usual.

I bought my X800XT PE on Dec. 14, 2004, long after SLI had been reviewed and was on sale.
So how does the purchase of a single 6800 followed by your pimpage that a 6800/6600GT SLI is viable fit into your "I dumped the ATi card to get a better SLI solution"?

More specifically, how does this fit into your "I refuse to pay for the same technology over and over again" when you purchased no less than six NV4x-based cards?

Big deal - they're still two-year-old games.
Yep and a lot of fun to play too. Of course you'd never know because you don't actually play games.

It angers you for some reason that I play current games at 19X14 4X8X?
Why should it anger me?

If I turned on the 16X AF, you'd just switch to something else to mindlessly flame.
The question is why not run 16xAF, just like anyone who knows how to drive will run in fourth gear instead of third when it's better to do so.

You know what I think? I think you're so clueless you don't actually know what settings you can run on your own hardware so you just parrot whatever settings the websites use.

This would explain your trend of sticking to 1024x768x8x4, 1600x1200x8x4 (that was an absolute joke and ironically you're now slamming crossfire for being limited at the very same resolution you voluntarily ran on your 7800 SLI setup for months), and now 1920x1440x8x4 when the likes of Anand have started using that setting.

This would also explain your refusal to accept any resolutions in between the websites' (like the 5800U fiasco where you couldn't understand there was something between 1024x768 and 1600x1200), why you never run the xS modes, and why you refuse to run 16xAF since basically none of the websites use it.

16xAF must be uncharted territory for you, eh Rollo? Don't worry, you can ask Uncle Thresh and Uncle Tom to hold your hand to guide you through it. :roll:
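For the record, asking for maximum AF is about one line per texture through the EXT_texture_filter_anisotropic extension. A minimal sketch, assuming a current GL context that actually exposes the extension:

[code]
// Request the driver's maximum anisotropy (16x on NV40/R420-class parts)
// for one texture. Tokens are defined here in case the GL headers are old.
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void set_max_aniso(GLuint texture) {
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso); // e.g. 16.0
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}
[/code]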

You might drop the BS "how many games have you finished lately" babble as well.
Why? "OMG, nVidia has soft shadows!". Tell me Rollo, how many hours have you actually logged with soft shadows? Probably less than the number you've been pimping them I'd wager.

I don't have lots of hours long stretches of time to sit down and finish games.
But you appear to have plenty of hours to preach at the forums.
 

remagavon

Platinum Member
Jun 16, 2003
2,516
0
0
The Voodoo5 was a great card at the time, just like the Voodoo4 (competed very well with the GF2 MX in the same price range). T&L didn't make that large of a difference; once guardband clipping was enabled in the tweaker, Voodoos ran Quake 3 pretty quickly and beat the hell out of anything else using FSAA.

If they had actually tweaked the drivers and been able to perfect the tiling mode, 3dfx would have been able to pull far ahead of even the GF2 Ultra when using FSAA at basically zero performance hit. My Voodoo4 was able to use aggressive tiling with little artifacting and it ran fantastically. Too bad they got bought out before that could be used to their advantage.

The worst card overall was probably the Savage 2000, or the Stealth III (its predecessor). I had one that froze CONSTANTLY, stock settings etc., and a friend of mine had a different brand that did the same thing. Using S3 Metal in UT made our systems lock up after a bit of playing; very annoying.

Another very strong contender is the Parhelia. I'm the only one I know (in person) that ever actually owned a Parhelia, and my oh my, what a piece of crap. Even the professional 2D features were offset because of the DEFECTIVE nature of the cards (the banding problem), which was strongly evident in any 3D situation and even showed up on the desktop - ironic considering Matrox's usual 2D perfection. It couldn't even run UT at any kind of playable rate using the fancy FSAA (which, I admit, was gorgeous). I really hope they eventually make a comeback, but that's very unlikely.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Rollo
Originally posted by: apoppin
the MAXX was universally reviled - except by you . . . i have to assume there is 'selective memory' occurring.
Errr, it was a cool card, A for effort, C for performance.
More universally, B for effort and D for performance - F in Win2K [ati lied]

X-fire won't have flashing textures on LCDs . . . the 60Hz limitation is fixed in future revisions . . .
edit: and you probably DID experience gaming with Win98SE at 60Hz with your MAXX
I wasn't aware Win98SE limited refresh rates?
no . . . some game developers did . . . this is ancient history . . . Thief/the Dark Engine, i think, for one

OTOH the MAXX was never fixed . . . rather it was abandoned by ati, who reneged on its promises to its MAXX customers . . .
The 16X12 at 60Hz will never be fixed either? And it's a worse problem than not working in an OS that wasn't exactly "all the rage" for gaming. In the early days of Win2K it wasn't the gamers' choice.
Sure it will be fixed for X-fire in r520 and for sure in r580. ati NEVER fixed the MAXX - never mind the "early days of Win2K".

Look back almost a year . . . SLI had its share of limitations . . . still does . . . you aren't STILL rebooting your PC when you want to turn it on or off are you?
:roll:
You've said this a couple times lately, and I have to say: Why would I want to turn off SLI? I haven't "turned it off" all year.
You well know the [many] other limitations - including the need for identical card BIOSes - that's WHY there is an SLI2

Beyond that, this issue will be addressed in either the 78.03s or the 8-series drivers, I forget which. (mostly because I don't care)
What's good for nVidia is also good for ATI . . . these "issues" will be addressed in future revisions


 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
the AA/AF levels between R300 and NV30 aren't directly comparable. the NV30 looks like ass compared to the R300 at the same levels, so you have to pump up the AA/AF on the NV30, which drops the performance.

The AF on the NV30 was clearly superior to the R300's - only the AA really shined for ATi in that comparison. No matter what you did, you could not (and cannot) make the AF on the R3x0 parts look as good as it did on the NV3x parts (which is sad, as the NV3x parts were a huge step down from the NV2x ones).

Voodoo 5 wasn't "late", it came out about the same time as the GF2.

It was competitive with the GeForce DDR though, which is what it was supposed to launch against (not after the GTS).

It was only lacking Hardware T&L

Hehe - you mean it was only lacking hardware T&L if we ignore anisotropic filtering, Dot3, register combiners, EMCM, and for that matter even the basic functionality to perform proper trilinear filtering in multitextured games? All of these features were in use in several of the top-rated games the year the V5 shipped - not some far-off future.
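To make "Dot3" concrete: by the V5's launch, per-pixel dot-product bump mapping was exposed through the standard texture-environment combine path (GL 1.3 / ARB_texture_env_dot3), and the V5 simply couldn't run it. A minimal setup sketch, assuming a normal map is already bound and the light vector is packed into the primary color:

[code]
// Dot3 bump mapping via the fixed-function combine path: the texture unit
// computes dot(normal-map texel, primary color), both remapped to [-1,1].
#include <GL/gl.h>

#ifndef GL_COMBINE
#define GL_COMBINE       0x8570
#define GL_COMBINE_RGB   0x8571
#define GL_SOURCE0_RGB   0x8580
#define GL_SOURCE1_RGB   0x8581
#define GL_OPERAND0_RGB  0x8590
#define GL_OPERAND1_RGB  0x8591
#define GL_PRIMARY_COLOR 0x8577
#define GL_DOT3_RGB      0x86AE
#endif

void setup_dot3_bump() {
    // Assumes the normal map is bound and glColor holds 0.5 + 0.5 * L.
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
}
[/code]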

The FSAA in the V5 was superior to everything on the market

But unfortunately you had to have your game details set to low too often as the V5 was completely incapable of rendering the newer effects at all.
 

earthman

Golden Member
Oct 16, 1999
1,653
0
71
Actually I would say that x-fire could very well be a success down the road. Not that I want one, one card is plenty noisy enough for me, but I am weird that way. As far as the worst vid card in history - without a doubt the Trident 1MB

I remember Trident. You plugged them in, turned the system on, and crossed your fingers hoping you would actually get a boot screen. Usually you didn't.

First gaming PC:

Diamond Monster 3D (Voodoo1) 2meg texture, 2meg framebuffer!

Diamond Monster Sound

Diamond Viper 4meg for 2d

Hard to believe these three things cost over 500 bucks at the time...

 

js1973

Senior member
Dec 8, 2000
824
0
0
I loved my Voodoo5. It made Tribes look awesome and the dual aftermarket copper heatsink/fans I attached with silver epoxy looked cool. That was my first hardware modification. Oh the memories.

The Parhelia was the first thing that came to mind when I saw this thread. The specs on that card had Matrox fanboys in a lather until the card was actually released. They thought they could finally let go of their G400s and no longer just be content with having the supposed best 2D on the market.

Wasn't the AA scheme broken on the card at release?
 

sparkyclarky

Platinum Member
May 3, 2002
2,389
0
0
Originally posted by: Rollo
BTW, my reasons for candidates:

V5 - late, featureless compared to the GF2 and Radeon VIVO, strange SLI-on-a-board card

5800 Ultra - couldn't be produced profitably, loudest OEM fan made it notorious

S3 Savage 2000 - supposed to compete with the GF1, but with T&L enabled, fps went down. Not to mention texture irregularities and desktop issues. (e.g. areas of wallpaper "missing")

ATI Crossfire- high end solution launched with 16X12 at 60Hz limitation, master/slave cards, dongle


The V5 was late, but not by a ton, and it had some VERY nice AA for the time; lack of hardware T&L truly didn't matter much then - 3dfx's main problem was moving from being a supplier of chips to a producer of cards, a move which pretty much borked their finances entirely

The 5800 Ultra was certainly late (about 6 months), the SM2.0 performance was substandard (again, not a huge issue at the time), and the loudness was there for sure

The Savage 2000 takes the cake by far - completely hosed drivers, advertised as having T&L when it was for all intents and purposes a broken feature (it simply didn't work, which is worse than the V5 case, where they didn't have/didn't advertise the feature).

Crossfire - too early to tell, isn't a single card, has some redeeming features (it still is fast...)


so, Savage 2000, by a landslide, for simply having its main feature completely broken

also, an honorable mention goes to the S3 ViRGE, the very first 3D decelerator (enabling the 3D functionality of the chip significantly slowed games down)
 

virtualrain

Member
Aug 7, 2005
158
0
0
Even if Crossfire isn't the most spectacular failure in video card history, it would appear by the poll in this forum that it probably qualifies as the only video card solution to be declared a failure by many observers BEFORE it's even available!

That must say something!

ps. I've had the benefit of building systems at times when one vendor was clearly hot and the other not. My last system a few years ago was built around an ATI 9700 Pro, which was untouchable at the time. Now my current system is built around a 7800GTX, which is also a front-runner... Buy whatever card rules the day when you need it - who gives a sh!t which company makes the chipset?!
 

Kyanzes

Golden Member
Aug 26, 2005
1,082
0
76
Originally posted by: Acanthus
I would put the XGI Volari series as the most spectacular failure.

Anyone remember their "market leader by 2007" roadmap? :laugh:

Hey, it's still 2005! The market still has nearly two years left to prepare to withstand the massive attack by XGI. :laugh:
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
There are a few omissions from the poll that are well worthy of going in:

Nvidia NV2..... the one that they don't like to talk about
S3 ViRGE - something that should have pushed 3D cards forward but literally went backwards

Some of the items in the list I think should be seen more as successful failures, i.e. nothing wrong with the product, but it just didn't take off.

The Voodoo5 comes under that category - the product was fast enough and, although in many ways horribly inefficient, was not a complete failure. It rightly lost out in sales though, as to be honest it was not the best solution.

In the poll list I picked the FX 5800 for a few reasons. Mainly because, for the amount of time nvidia spent designing it, it was rubbish to be honest - indeed, if I remember rightly, nvidia missed one of their 6-month cycles coming up with that monster. Not only was it inefficient (power-wise - a huge fan making lots of noise to keep it cool), it wasn't as fast as the competition the majority of the time, and worse still it didn't have as good an implementation of the DX9 feature set - full-precision (FP32) shading was so slow the drivers kept dropping to FP16, for example.
All that from a year of gestation, and worse still it was released months after ATI released the R9700......

The Savage 2000 could have almost won this had it not been for the commendable perseverance of the S3 driver department, which to be fair must have been undermanned and drowning in the deluge of bugs that needed fixing. Yep, it was (and to some degree, even up to the last drivers, still is) a fairly buggy graphics card, but boy did those last few drivers make the difference. Yep, the T&L was buggy (pretty sure at the hardware level) but it did eventually work without a glitch - the downside was that it was in DirectX only - OpenGL T&L was an experience of randomly appearing polygons and bits missing.... Eventually T&L was switched off because, apart from having limited use, it was also slower than T&L from other cards and couldn't match the throughput of the GeForce.
It was also, don't forget, meant to be good performance on a budget - the aim of outperforming a GF SDR whilst only costing the same amount was finally realised; unfortunately, by that time the GF2 was the thing and S3 were in trouble.
Game performance was great when it did work, and the card did occasionally put one over on the GF SDR; however, whilst ongoing driver updates sped up the GeForce, the Savage ones fixed bugs and only raised the speed a touch.
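It's easy to picture the kind of check that got T&L switched off: time both vertex paths on a throwaway batch and keep the winner. A purely hypothetical sketch of that pattern - the transform stubs are invented, not S3's or any shipping game's code:

[code]
// Hypothetical startup check: benchmark the hardware T&L path against the
// CPU path and keep whichever is faster. Both transforms are dummy stand-ins.
#include <chrono>
#include <cstdio>

static volatile float sink = 0.0f;
void transform_hw(int verts) { for (int i = 0; i < verts; ++i) sink += i * 0.5f; }
void transform_sw(int verts) { for (int i = 0; i < verts; ++i) sink += i * 0.25f; }

template <typename F>
double time_ms(F f, int verts) {
    auto t0 = std::chrono::steady_clock::now();
    f(verts);
    return std::chrono::duration<double, std::milli>(
        std::chrono::steady_clock::now() - t0).count();
}

int main() {
    const int kVerts = 1000000;
    double hw = time_ms(transform_hw, kVerts);
    double sw = time_ms(transform_sw, kVerts);
    // On a Savage 2000 the hardware path often lost - hence it was disabled.
    std::printf("hw T&L %.2f ms, sw T&L %.2f ms -> use %s\n",
                hw, sw, hw < sw ? "hardware" : "software");
    return 0;
}
[/code]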

ATI Crossfire could be a great failure in the making BUT it's only in its first version and still being worked on. To be fair many manufacturers have had times where their implementation of something was lacking. After all the TNT architecture had a few features which just plain didn't work.
We'll just have to see how it pans out. I don't see it being as successful though, to be honest - this master-card idea seems like something I would have expected from the '80s and early '90s.....
I can understand the approach but I think it would be better to make all the cards the same and have a jumper/dip switch on it that can select master, etc.

The Matrox Parhelia is a mixed bag. Good output options, etc., but a bit too early. A bit of further development and later release and it could have been so much better.

XGI = Trident + SiS Xabre...... good ideas and high marks for effort, but it just never really happened. They'll need a product as fast as the big boys' to gain even a tiny amount of market share - same as S3 really with their new ______chrome series.
The Volari series were OK, but I think people expected too much from the company to deliver on the Duo cards (2 GPUs)...... it was really just a party piece IMHO - kinda like the Voodoo5/6 card with oodles of GPU chips....... nice idea but you could tell it just wouldn't happen.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BFG10K
I bought my X800XT PE on Dec. 14, 2004, long after SLI had been reviewed and was on sale.
So how does the purchase of a single 6800 followed by your pimpage that a 6800/6600GT SLI is viable fit into your "I dumped the ATi card to get a better SLI solution"?
Actually I went from 6800NU > 6800GT > X800XTPE > 6600GT SLI. You still don't get it. I don't necessarily buy things because they're better than the last and I'm on some path toward the "best" - I buy things to try them out because it's fun to get new computer hardware.

More specifically, how does this fit into your "I refuse to pay for the same technology over and over again" when you purchased no less than six NV4x-based cards?
Actually I purchased TEN nV4x cards; they were a far more interesting product line. I bought 4 SLI sets to bench them and post results for the board.

Big deal - they're still two-year-old games.
Yep and a lot of fun to play too. Of course you'd never know because you don't actually play games.
That's nice, old games can be fun, but the graphics are lacking and there's a fairly large difference in playing Tribes 1 at 19X14 and playing HL2 at 19X14.

It angers you for some reason that I play current games at 19X14 4X8X?
Why should it anger me?
It certainly seems to, as you waste no opportunity to note that I should be playing at 16X AF, because you say so.

If I turned on the 16X AF, you'd just switch to something else to mindlessly flame.
The question is why not run 16xAF, just like anyone who knows how to drive will run in fourth gear instead of third when it's better to do so.
It's not as big a difference as 3rd and 4th gear on a car, and there is a performance cost. I suppose I could look into how much it really is, but to be honest, the games look fine at 19X14 4X8X, so I haven't cared enough to check.

You know what I think? I think you're so clueless you don't actually know what settings you can run on your own hardware so you just parrot whatever settings the websites use.
You know what I think? I think you're a jealous man who wishes he had the means to play with hardware like I do. Since you don't, you follow me around yelling "He liked the 5800!" and "He uses 8X instead of 16X!" to try and discredit the work I do for this community.
Whatever you think of me, please list the other members here who went out and bought four flavors of SLI and posted benches for the community on their own time.
Who else here has an old college buddy who gets them some inside info from time to time and shares it with the community? (e.g. I was the first person on the web to post about nV's patch for the 7800 shimmer issue - you guys knew about it before anyone in the world)
To be honest, there are times petty little people like you make me want to just say "heck with it, I don't need this". If there weren't people here who seem to appreciate my efforts, I would, and they'd be left with the likes of you telling them how it's wise to buy slower video cards, if they don't have a fan. :roll:
 

mindgam3

Member
May 30, 2005
166
0
0
Why does every reply you make, Rollo, always point out that people are jealous and that the only reason people flame you is that they wish they could play games like you do... Endless repetition gets old!!

We are all bots.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I voted for the S3 Savage 2000, because it was actually broken in hardware and the problems it had could not be fixed in any way by the S3 driver team working on it.

Now that was a chip that coulda used a re-spin! (or two).

As for the Voodoo5, it was big and too late, and proved that 2 old-school GPUs could not beat 1 new-gen GPU (Hello.. anyone out there?). But at least it worked and had an excellent driver team to back up any failings.

5800 Ultra, another late arrival, pumped up by the PR machine, and the damn 2-slot fan was pointed the wrong way!! (Hello... Hot air should be exhausted outside the case if you have the room for an exhaust slot!) I think the 2-slot design would have been well received had they pointed it the right way. Brilliant, they would have exclaimed!!

ATi Crossfire, late, really not what we want (because we want the R520 benches already, dammit!!), and proof once again (Hello) that 2 old-school GPUs get schooled by 1 new-gen GPU. But put 2 R520s in there and it will no doubt prove that 1 R580 will beat 2 R520s (Hello.. again!)
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,655
136
Originally posted by: mindgam3
Why does every reply you make, Rollo, always point out that people are jealous and that the only reason people flame you is that they wish they could play games like you do... Endless repetition gets old!!

We are all bots.

No, you're just missing the fact that this debate between Rollo and BFG follows just about every video thread. I think maybe 2 other people will also chime in, in an attempt to discredit Rollo as biased. I feel for the guy, but everyone has a little bias in them about everything; he tries his best to not let it affect his opinion, but he is biased toward Nvidia. I am the same way, but I always put that disclaimer in when talking about the two of them.

Oh, and the Voodoo5 6000 was the straw that snapped the camel's neck, after buying STB (and refusing to sell their chips to other makers, sending several out of business) broke its back and the Voodoo4 kicked it while it was down. Poor camel.
 