Come on NVIDIA why is it so hard for you to............

Page 3 - AnandTech Forums

pen^2

Banned
Apr 1, 2000
2,845
0
0
OMG, I just went back and reread your post only to realize what a quasi-illiterate moron I was being... see what sleep deprivation does for ya.

Now I am in complete agreement with everything you said in your first post :Q
 

Weyoun

Senior member
Aug 7, 2000
700
0
0
LOL @ BrotherMan

Yeah, hopefully the next set of Detonators picks up on image quality in general, especially 2D, although this isn't necessarily their fault either. They still have power over the situation though; IMHO they shouldn't license their chips to companies using low-quality materials, since it gives them a really bad reputation....

A revamped compression algorithm would also be greatly appreciated; it sure would stop a lot of you guys here b!tchin
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Oldfart:

Why are you so bent on constantly posting anti 3dfx?

As you say, I haven't been posting much on that issue lately (except for the V5 6000, for obvious reasons!), but it annoys me that whenever Quake 3 + S3TC is brought up, Wingnut always has something to say about it. That issue has been beaten to death already but he just won't give it a rest. Quite frankly, I'm as tired of him posting stuff about it as you are of my 3dfx bashing.

My point was that I was complimenting Epic for supporting 4 APIs and a good software renderer. I've personally played the game in every one of them. All of them work very well on the proper hardware.

Yes, it is decent of them, but perhaps they should have just concentrated on Direct 3D or OpenGL and made it work well. Face it, the only API that works well is Glide. I get 65 fps in Glide on my system and 52 fps in Direct 3D. It doesn't take a genius to figure out which is the primary API. As for OpenGL support, it totally blows, and even Epic admits that. In fact I get a crappy 36 fps with OpenGL.

He has a crap video card without 3D acceleration. Would you like to tell him that the software renderer wasn't a good idea?

That's not really my concern, and neither should it be Epic's. Face it, an advanced program like Unreal requires a 3D accelerator to run at any playable speed. If Epic dropped the software renderer they could have spent more time optimising the other APIs and written a leaner program. I doubt your friend has a good enough CPU to run Unreal in software mode very well anyway, and if he does, it's bizarre that he doesn't have a 3D accelerator to go with it.

Wingnut, I'm only going to bother to answer this statement:

Ok, this is pure bullsh*t. My Q3 benchmarks are neck and neck with a GTS at 1280x1024x32 resolution. Sure the GTS kicks it at 640x480.

No, your statement is pure bullsh*t, for two reasons:

(1) There is no indication anywhere on the web that 3dfx's new drivers boost performance as is claimed in this forum. Not even Anandtech could reproduce those sorts of results.

(2) 640 x 480? That's CPU limited. What relevance does that have to video card performance? Unless you are trying to tell me that T&L does actually help? Of course you'd never admit that now, would you, being a 3dfx zombie? Again you are talking pure BS.

Face it Wingnut, you accuse me of being an nVidiot yet you just can't control yourself whenever somebody posts up something about nVidia and their problems. You just have to stick your beak in and peck at the issue. Take your damn Voodoo and your supposed speed increases and play some damn games. Don't try to convince everyone else how "bad" nVidia are. You've picked your video card, now use it.

IBMer:

We will go to the sharky review for this one.

That's not the review I was talking about. I have seen at least 3 reviews where the reviewer says the Voodoo does an "interesting" rendering job with the original image. And by "interesting" they don't mean good.

MiniGL is gone and there was always WickedGL, which didn't have any problems. BTW most MiniGL support was also implemented into games by the designers.

I'm well aware it's gone. But what problems did we have when it was here? And don't deny there weren't any problems, unless you are a zombie like Wingnut.

Pure FUD, they do have DVD hardware acceleration.

Uh, did you even read the rest of the article you linked to? This is taken from page 7:

NVIDIA's GeForce2 takes second, mainly because both 3dfx and Matrox forgot to include any video acceleration features in their respective products. Whoops, maybe next time, guys.

I'm sure you can manage to figure out what that means.

By the time these "standard" features get implemented in games, the next generation cards are out.

How about true tri-linear filtering? T&L? Environmental bumpmapping?

Q3, FAKK2, SOF, and UT, when adding the S3TC hack, all have problems with the sky because S3TC is inferior to FXT1.

Let me get this straight. You are faulting nVidia for:

(1) S3TC.
(2) Using a hacked game and having problems as a result of this hack.



What part of S3 do you not understand?
What part of hacked do you not understand?

The V5 isn't that big and the V4 isn't that big either

The V5 isn't big? Almost every single review I have seen of it had the reviewer commenting on its ridiculous size.

Hmmmm.... let's see, is that the only thing that a game engine is superior for? Let's see a vast open level with the Q3 engine that doesn't choke. The UT engine is a great engine and that is why people keep using it.



How old are you? Seriously?

I really tire of wasting my time arguing with bigoted 3dfx zombies. Take your damn cards and use them instead of coming in here and b*tching all the time about what nVidia have done wrong.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Wingnut-

"Q3, FAKK2, SOF, UT when adding the S3TC hack all have problems with the sky because S3TC is inferior FXT1."

First, UT has no problem with the sky at all. If it is showing problems, then someone doesn't have it set properly. Also, what do you mean about the levels with red skies? They show very little difference from the non-compressed versions, clearly vastly superior to the default id settings.

Why isn't Star Trek: Elite Force on your list?

That uses the same engine as the other three yet has absolutely no problem with S3TC. This is a game engine issue, not nVidia's implementation, which follows the S3TC specs. FXT1 is hacked very well to deal with Quake 3, much as ATi's drivers are. If you compress lightmaps you will have image quality issues, and the V5 isn't artifact free, it just does better than the GeForce boards.

How is FXT1 working on the UT compressed textures, btw? Since it is superior, and all that matters is actual games and how they work, and the V5 is clearly the best choice for UT, then it must work much better than the GeForce, right?

"Maybe this is a sign that NVidia doesn't want to implament the 128x128 texture occlution like ATI because their benchmarks would suffer a lot."

So breaking their drivers is now a good thing? Crippling proper support to deal with someone else's half-@ssed code is now a plus? If they cripple their drivers for one game, problems may well pop up in others, much like with other companies.

Weyoun-

"Can someone with a lot of free time actually go through the Q3 pak file and compress all these textures, and see if it still works? If it does, no more checkerboard lightmaps, and we might see a fix for this problem yet..."

I have done it, and it doesn't fix the lightmaps. This is id's problem. The lightmaps aren't handled as "normal" textures and are compiled with the level (stored as an actual part of the map). You can see that the other light sources in the game, the torches and rockets and such, don't have any issues at all with texture compression, only the lightmaps that are part of the level.
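For readers wondering why compressed skies and lightmaps band at all: the publicly documented S3TC/DXT1 format stores each 4x4-texel block as just two 16-bit RGB565 endpoint colours plus 2-bit per-texel indices, so no block can ever contain more than four distinct colours, which is brutal for smooth gradients. A rough sketch of the format (illustrative Python, not anyone's driver code):

```python
# Sketch (not driver code): decoding one DXT1/S3TC block, to show why
# smooth gradients such as skies and lightmaps band. Each 4x4 block
# stores two RGB565 endpoints plus 2-bit indices, so at most 4 colours
# can appear in any block.

def rgb565_to_rgb888(c):
    """Expand a 16-bit 5:6:5 colour to 8-bit-per-channel RGB."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

def dxt1_block_palette(c0, c1):
    """The 4-colour palette every texel in the block must pick from."""
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:  # opaque mode: two interpolated in-between colours
        p2 = tuple((2 * a + b) // 3 for a, b in zip(p0, p1))
        p3 = tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))
    else:        # punch-through mode: midpoint plus transparent black
        p2 = tuple((a + b) // 2 for a, b in zip(p0, p1))
        p3 = (0, 0, 0)
    return [p0, p1, p2, p3]

def decode_dxt1_block(c0, c1, indices):
    """indices: 16 values of 0..3, one per texel, row-major."""
    pal = dxt1_block_palette(c0, c1)
    return [pal[i] for i in indices]

# A smooth 16-step blue sky gradient collapses to 4 distinct colours:
texels = decode_dxt1_block(0x001F, 0x0000, [i * 4 // 16 for i in range(16)])
print(len(set(texels)))  # 4
```

Normal detail textures hide this well, but the low-frequency gradients in lightmaps and skies make the four-colour quantisation show up as visible banding.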
 

superbaby

Senior member
Aug 11, 2000
464
0
0
What the heck. Unreal was developed for Glide, therefore it will run slower/worse on Nvidia cards. That's the end of it, there's no 3dfx bashing involved, yeeesh. If you want Unreal to play nice on an Nvidia card, wait for the next Unreal; it won't be written in Glide.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Then why does it play so well on a Radeon in D3D or OpenGL? Totally smokes my V3 3000 in Glide. BTW, I remember reading that Unreal was written with a software renderer. Glide and all of the other hardware renderers were added in.
 

LiekOMG

Golden Member
Jul 5, 2000
1,362
0
0
Sheesh, I can't believe that you guys are complaining about Unreal using too many APIs! Look at Half-Life. It supports software rendering, OpenGL, and D3D. Software obviously runs the worst, but it's there to offer compatibility for those who don't yet have 3D accelerators. OpenGL is the "recommended" API to use. D3D was included because back in 1998 when the game was released, not all cards supported OpenGL fully. However, for anyone who has ever tried the D3D rendering, it's terrible! Yet no one seems to bash Valve for supporting more rendering options, even though one works better than the others. Perhaps it's not as relevant now, as most 3D cards support OpenGL, but 2 years ago not all cards did. So lay off Epic, and leave 3dfx alone! And try to keep in mind that Unreal began production in 1994 and was released in early 1998. The engine is OLD, and still manages to stay neck and neck with id's greatest!

I rest my case
 

LiekOMG

Golden Member
Jul 5, 2000
1,362
0
0
Yep, you're right. There were no 3D accelerators back in 1994-95, so Unreal was supposed to be a strictly software-rendered game, like Quake was. However, just like Quake 1, as 3D accelerators became more common, Epic added support for them to Unreal, just as id added OpenGL support (actually, it was only a MiniGL at first, written only for 3dfx cards).
If anyone here has ever played the old Unreal beta (released 1996), you would have seen that the only option was software mode. And now look - the same engine in its latest build supports things like large-scale terrain, motion-captured skeletal animation, and T&L!
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
Go ahead and call me a 3dfx zealot, then explain why I own an Asus V7700 64MB Pure... It's just that I don't feed into all the bull and ignore problems like the NVidiots do, or even blame them on someone else.

That's not the review I was talking about. I have seen at least 3 reviews where the reviewer says the Voodoo does an "interesting" rendering job with the original image. And by "interesting" they don't mean good.

I don't care what review you were talking about. In the Sharky Extreme review, the V5 and V4 were proven to be more accurate than the Microsoft software renderer that was used as a reference. That makes your comment absolute FUD.

I'm well aware it's gone. But what problems did we have when it was here? And don't deny there weren't any problems, unless you are zombie like Wingnut.

Are we going to bring up problems with the Riva128 to prove a point about the GTS... learn some rhetoric.

NVIDIA's GeForce2 takes second, mainly because both 3dfx and Matrox forgot to include any video acceleration features in their respective products. Whoops, maybe next time, guys.

Has a review ever had a mistake.... Take a look at Anandtech's review.

DVD Quality Benchmark

This is where the V5 almost beat out the GTS in DVD playback quality and performance.

Also how can you deny that it even SAYS it on the 3dfx website and on the box that it has:

"DVD hardware assist: planar to packed-pixel conversion"

But of course you have owned a V5 right? So you know.....
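For context on what "planar to packed-pixel conversion" actually means: DVD decoders typically produce planar YV12 (a full-resolution Y plane plus quarter-resolution U and V planes), while overlay hardware often wants packed YUY2, where luma and chroma are interleaved per pixel pair. A toy sketch of the conversion (illustrative Python; on the card this step is done in silicon):

```python
# Sketch of "planar to packed-pixel conversion": turn planar YV12
# (full-res Y plane, quarter-res U and V planes) into packed YUY2
# (Y0 U Y1 V for each horizontal pair of pixels). Illustrative only.

def yv12_to_yuy2(y, u, v, width, height):
    """y: width*height bytes; u, v: (width//2)*(height//2) bytes each.
    Returns YUY2 bytes: Y0 U Y1 V per pixel pair, 2 bytes per pixel."""
    out = bytearray()
    for row in range(height):
        crow = row // 2                      # chroma is subsampled 2x2
        for col in range(0, width, 2):
            ci = crow * (width // 2) + col // 2
            out += bytes([y[row * width + col],     u[ci],
                          y[row * width + col + 1], v[ci]])
    return bytes(out)

# A tiny 4x2 frame: 8 luma bytes, 2 bytes per chroma plane.
packed = yv12_to_yuy2(bytes(range(8)), b"\x80\x81", b"\x90\x91", 4, 2)
print(len(packed))  # 16
```

Doing this per frame in software is cheap but not free at DVD resolutions, which is why offloading it counts as "hardware assist" rather than full motion-compensation acceleration.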

How about true tri-linear filtering? T&L? Environmental bumpmapping?

How many people enable the 8-tap anisotropic filtering (the true tri-linear filtering) during benchmarks? That is a loss of 5 fps for the GTS. I happen to think upping the LOD bias looks better.

The GTS can't do environment-mapped bumpmapping. Hmmmmm. It does do dot-product bumpmapping, which I don't think looks as good.

T&L isn't needed in games yet. When it is, ALL card manufacturers will have a real T&L engine and not the CAD T&L engines that are present now.

Do you really think the GTS is going to support high-polygon games well? Try the X-Isle demo... It looks real pretty. When it hits 2 million polygons/s the fps goes down to 20-30 fps.

What happened to the 20-25 million polygons that it is supposed to be able to do? Oh wait, that is just marketing, right? Just like the 1.6 GigaTexels....


BEN:

How is FXT1 working on the UT compressed textures btw? Since it is superior, and all that matters is actual games and how they work, and the V5 is clearly the best choice for UT then it must work much better then the GeForce right?

FXT1 doesn't work with pre-compressed S3TC textures in OpenGL. That would be illegal, given that 3dfx doesn't have a license for S3TC in OpenGL. Someone is working on converting all the textures to FXT1, so don't worry.

FXT1 isn't a hack, BTW; it is a much more thought-out texture compression scheme. It has 4 modes, which even allow the compression of transparencies. BTW, if you use WickedGL with the V5 in Q3, the texture errors aren't present. And the FPS actually goes up.

So breaking their drivers is now a good thing? Crippling proper support to deal with someone elses half @ssed code is now a plus? If they cripple their drivers for one game, problems may well pop up in others much like other companies.

Sorry, it doesn't have to be something that is always on. You can have it as an option, much like the drivers for the S3 cards.

So now you are saying that Q3 is a half-@ssed game, yet it is the #1 benchmark used to support the GTS, even though the GTS is the one that looks inferior in the game. Funny how that is.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
"Don't try to convince everyone else how "bad" nVidia are."

Oh please. To infer that that was my intention is a complete lie. I've said over and over that the GeForce 2 cards are excellent pieces of hardware. If someone were to purchase one, they certainly wouldn't be unhappy (assuming they didn't have any incompatibility problems - software or hardware related).

By the way... calling me names such as "zombie" only shows your immaturity, BFG.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
Ben Skywalker, you quoted me, when I think you meant to quote IBMer (about the S3TC vs FXT1)... And, QUIT SPELLING MY NAME WRONG!!! (please note the winky-face)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
IBMer-

"FXT1 doesn't work with pre-compressed S3TC textures in OpenGL. That would be illegal being that 3dfx doesn't have a license for S3TC in OpenGL. Someone is working on converting all the textures to FXT1 so don't worry."

Is that the current excuse? You are trying to have it both ways. I don't fault 3dfx for not having support, I fault Epic. There is no reason for them not to enable DXTC, which would work for all the boards (even Matrox). I was trying to illustrate a point. The fact is that right now the V5, in a game situation, can't run the S3TC textures because of an external factor. Sound familiar? The GF and Radeon boards right now run UT with better visual quality because of sloppy Epic code, a new position for 3dfx owners to be in.

"FXT1 isn't a hack BTW, it is a much more thought out texture compression sceme. It has 4 modes that even allows the compression of transparencies. BTW if you use WickedGL with the V5 in Q3, the texture errors aren't present. And the FPS actually goes up."

DXTC/S3TC also allows compression of transparencies. The problem is that id isn't handling them properly in Quake 3. I haven't seen anything about all compression artifacts being removed by the WickedGLs; do you have a link? (Honest curiosity, I want to see how they are doing it.)

"Sorry it doesn't have to be something that is alway on. You can have it as an option, much like that drivers for the S3 cards."

I was turning TC off on my DDR for quite some time, what do you mean? Have an option for a hack to fix a known bug that id has already admitted to? What happens when they fix the bug? (I do have faith that id will fix it.)

"So now are you saying that Q3 is a Half-@ss game, yet is it the #1 benchmark to support the GTS, even though the GTS is the one that looks inferior in the game. Funny how that is."

The code for S3TC is sloppy. I could gladly post a very lengthy list of all the problems the UT engine has by comparison (and many are admitted to by Epic).

To significantly improve the visual quality of the sky for GeForce-based boards, it took changing a whopping 2 bytes of code. 2 bytes. Am I saying that Quake 3 is a poor engine? He!! no, just that Quake 3 as it stands now has poor support for S3TC. S3 boards exhibit the same problems the GeForce-based boards do, and they created the standard. Who do you honestly think is at fault? The creators of the standard have problems, but a game that came along well after the fact is flawless? S3 wouldn't have issues if it were nVidia's hardware or drivers at fault.

ATi has to disable compression for one certain size texture, why is that? If the problem is with nVidia, why does ATi also have to work around it? Why do S3 boards exhibit the same problems as nVidia boards? That is a lot of explaining to do to try and lay claim that somehow it is nVidia's problem and not id's.

Edit-

Wingznut-

Oops, sorry about that, name too (I never noticed I was spelling it wrong until now, and I type it quite a bit too).
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
DXTC/S3TC also allows compression of transparancies. The problem is that id isn't handling them properly in Quake3. Haven't seen anything about all compression artifacts being removed by the WickedGLs, do you have a link(honest curiosity, I want to see how they are doing it).

Since I don't have a V5 anymore, I can't test it out. I am referring to the original discussion on B3D that everyone was having when it was first found out that ATI was using the drivers to correct the id mistake. Reverend posted that WickedGL gets rid of the lightmap issues in Q3. I trust Reverend.
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
I was turning TC off on my DDR for quite sometime, what do you mean? Have an option for a hack to fix a known bug that id has already admitted to? What happens when they fix the bug?(I do have faith that id will fix it).

I know I get 65 fps at 1024x768x32 MAX w/ anisotropic filtering and texture compression off in Q3. Where does this leave the fastest card out there?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Wingznut-

"can I call you Benny?"

Hehe, that is what most of the guys I work with call me

IBMer-

I hit about 55 FPS with the same settings you have listed with my DDR (including anisotropic enabled through the display properties, which, btw, is different from trilinear filtering).
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
I hit about 55 FPS with the same settings you have listed with my DDR (including anisotropic enabled through the display properties, which, btw, is different from trilinear filtering).

Actually, you're incorrect.

8-tap anisotropic filtering is TRUE trilinear filtering.

NVidia uses mip-map dithering to simulate tri-linear filtering, but only 8-sample is true trilinear filtering.

My FPS was with my GTS un-overclocked BTW.
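For what it's worth, true trilinear does take eight texel samples per pixel (two 4-tap bilinear lookups blended across adjacent mip levels), which is likely where the "8-tap" figure comes from; anisotropic filtering proper is a separate technique layered on top. A toy sketch of the difference between a real trilinear blend and per-pixel mip dithering (illustrative Python, not any driver's code):

```python
# Sketch of the filtering distinction being argued: true trilinear
# blends bilinear samples from two adjacent mip levels by the
# fractional level-of-detail, while mip-map dithering just picks one
# level or the other per pixel and lets the noise average out visually.
import math

def trilinear(sample_mip, u, v, lod):
    """sample_mip(level, u, v) -> bilinearly filtered colour at that level."""
    lo = math.floor(lod)
    frac = lod - lo
    a = sample_mip(lo, u, v)
    b = sample_mip(lo + 1, u, v)
    return a * (1 - frac) + b * frac        # smooth blend: no mip band

def dithered(sample_mip, u, v, lod, noise):
    """Cheaper approximation: pick one mip per pixel using a dither value."""
    lo = math.floor(lod)
    level = lo + (1 if (lod - lo) > noise else 0)
    return sample_mip(level, u, v)

# Toy mip chain where each level is twice as dark:
sample = lambda level, u, v: 1.0 / (2 ** level)
print(trilinear(sample, 0.5, 0.5, 1.5))  # 0.375, halfway between mips 1 and 2
```

The dithered version does one bilinear lookup instead of two, which is exactly why it is cheaper and why a full trilinear blend costs frames.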
 

GaryTcs

Senior member
Oct 15, 2000
298
0
0
I guess my problem with the nVidia boards is that you have to fix the problems. They aren't small little one-game or one-scene incompatibilities either, they're whole graphics engine or chipset incompatibilities. The fact that someone smart can eventually find a way to work around this is no excuse for the original error.
They put out a decent product with very impressive possibilities, but then don't live up to them. No doubt it is a faster card, but does playing 75% of your games a little faster make up for not being able to play 25% of them? Especially when the framerates of ATI and 3dfx are acceptable in all the games, and they run on all platforms. Is this a downfall of releasing a new product every six months?

My V3 2000 installed (in '99) with the neat little "install" button on the installation program. My nVidia opened up a nasty little can of worms that still haunts my computer sometimes today. My buddy who bought it for use on his computer could not even install it - he couldn't POST with it. This was later found to be a conflict with the ALi chipset, which cannot be fixed. I bought it from him thinking I could do better; now I think a V5 or Radeon might just do the trick. For those who say "why would you get an ALi-based board?", I ask, why should I have to worry about every piece of hardware I own being incompatible?

I love it when I hear you guys going into the card's architecture and explaining the software downfalls. The ONLY thing that matters is the output. Make a product that runs all games at an acceptable framerate, doesn't make Windows go postal on me, and installs easily, and I'm a happy camper. I don't give a rip if it plays Quake at 200 fps, because I can't appreciate the improvement from 100.

 

Dulanic

Diamond Member
Oct 27, 2000
9,951
570
136
Ummmm, are you smoking crack? NVidia is the ONLY company right now with drivers that result in no FPS loss in Win2K. If you are actually serious in saying this, go try ATI's Win2K drivers, LOL.
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
Ummmm, are you smoking crack? NVidia is the ONLY company right now with drivers that result in no FPS loss in Win2K. If you are actually serious in saying this, go try ATI's Win2K drivers, LOL.

Sorry, that is in OpenGL, NOT D3D.

People lose around 2000-3000 points in 3DMark2000 going to Win2K.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
First of all, don't get me wrong: I LOVE UNREAL. I just got the expansion pack and it totally rocks. All I am saying is that their engine is based around Glide and is not tweaked as much for the rest of the APIs. Also, the engine is simply not as fast or as good as the Quake 3 engine.

Oldfart:

Then why does it play so well on a Radeon in D3D or OpenGL? Totally smokes my V3 3000 in Glide.

I'm glad to hear it does. It probably means it will run well on my MX when I get it.
For now though, Direct 3D runs slower than Glide on my V3.

IBMer:

Go ahead and call me a 3dfx zealot, then explain why I own an Asus V7700 64MB Pure

What does that prove? I own a V3 but I hate 3dfx.

I don't care what review you were talking about. In the Sharky Extreme review, the V5 and V4 were proven to be more accurate than the Microsoft software renderer that was used as a reference. That makes your comment absolute FUD.

Pardon me, but I always thought Sharky weren't the only review site around.

Are we going to bring up problems with the Riva128 to prove a point about the GTS... learn some rhetoric.



Has a review ever had a mistake.... Take a look at Anandtech's review.

That's funny. Two paragraphs ago you were telling me all you care about is Sharky's review and you denounced the others. Now, when it's convenient, you ignore Sharky and go somewhere else to back up your arguments. So which is it? Are we using other reviews or sticking just to Sharky?

If you're going to argue, at least learn how it's done; otherwise you just come off looking like a dumbass.

Also how can you deny that it even SAYS it on the 3dfx website and on the box that it has: "DVD hardware assist: planar to packed-pixel conversion"

The Voodoo3 box says: 32 bit texture rendering pipeline, bumpmapping and tri-linear mip-mapping. Yet in real tests it fails every one of them. Have you ever heard of marketing BS? Because 3dfx are really good at using it.

How many people enable the 8-tap anisotropic filtering (the true tri-linear filtering) during benchmarks? That is a loss of 5 fps for the GTS. I happen to think upping the LOD bias looks better.



So you think cards should only have features that are enabled in benchmarks? Riiiiight.

The GTS can't do environment-mapped bumpmapping. Hmmmmm. It does do dot-product bumpmapping, which I don't think looks as good.

I think T&L is good so why haven't 3dfx used it? That would make 3dfx inferior, according to your own logic.

T&L isn't needed in games yet. When it is, ALL card manufacturers will have a real T&L engine and not the CAD T&L engines that are present now.

T&L is enabled in benchmarks isn't it? From your previous benchmark statement that would qualify it as being a necessary feature, wouldn't it?

Do you really think the GTS is going to support high-polygon games well? Try the X-Isle demo... It looks real pretty. When it hits 2 million polygons/s the fps goes down to 20-30 fps.

It supports them a hell of a lot better than the V4/V5 does.

What happened to the 20-25 million polygons that it is supposed to be able to do? Oh wait, that is just marketing, right? Just like the 1.6 GigaTexels....

It's called memory bandwidth limits. You should know, because the V5 suffers from them more than the GF2 does because of texture duplication and multi-processing overhead.
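Back-of-the-envelope numbers make the bandwidth argument concrete. The figures below are the commonly cited GeForce2 GTS specs, used here purely as assumptions for the sketch:

```python
# Back-of-the-envelope sketch of the bandwidth argument: the chip can
# rasterise far more pixels than its memory can feed. Spec figures are
# assumptions (commonly cited GeForce2 GTS numbers).

core_mhz   = 200                               # core clock
pipelines  = 4                                 # pixel pipes, 2 texture units each
fill_mpix  = core_mhz * pipelines              # 800 Mpixels/s claimed
mem_bw_gbs = 166e6 * 2 * (128 / 8) / 1e9       # 166 MHz DDR, 128-bit bus

# Each 32-bit pixel needs roughly: colour write (4 B) + Z read (4 B)
# + Z write (4 B) = 12 B of raw traffic, ignoring texture fetches.
bytes_per_pixel = 12
sustainable_mpix = mem_bw_gbs * 1e9 / bytes_per_pixel / 1e6

print(round(mem_bw_gbs, 2))      # 5.31 GB/s of memory bandwidth
print(round(sustainable_mpix))   # ~443 Mpixels/s vs 800 claimed
```

Texture fetches only make the gap worse, which is why the marketing fill-rate and triangle-rate numbers were never reachable in real scenes.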

You know what I think you are? I think you don't have a clue what you are talking about. You just tag along with the guys at Beyond3D and then come in here and regurgitate their arguments. The problem with this is that you take their arguments way out of context and you just come off looking like an idiot. So do us a favour and stop pretending to be an expert.

I don't claim to be an expert like BenSkywalker, but even I can see your arguments are flawed and illogical.
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< Nvidia's Win2K drivers really suck >>


What are you smoking?
Why don't you compare nVidia's drivers to every other competitor's?
 

Howard

Lifer
Oct 14, 1999
47,982
10
81


<< Would an ATI Radeon be a better choice for games like this? Anyone have any experience with this card? >>



For games like U/UT and Deus Ex (all of which run on Glide), the best choice would be the 3dfx Voodoo 5 5500. If you don't want to spend so much money, get a Voodoo 4 4500 (not exactly the best bang-for-buck card) or a Voodoo 3 2000/3000. You'd probably prefer the V3 2000 because, with a fan, it can overclock past 3000 clock speeds.
 