nVidia Q&A posted


Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
Fandu, the results of default testing are already in; that's what we (should) see in all the reviews we currently have.

Second, it's really impossible to compare individual cards based on different chips, etc., because they each have their own quirks.

The scientific method does not say to compare results only after you've done everything you can to fix a problem.

In this case, comparing different video cards (a 3dfx V5 vs. a GeForce GTS) is very hard to do, because the GTS and VSA-100 are totally different designs.

This is why reviewers just stick to the default settings: it's easy, and that's what normal people will get.

It may very well be that 3dfx owners CAN get better speed simply by enabling an option that, if enabled on the competing card, wouldn't result in the same performance increase (hmm, I wonder why; could it be they're DIFFERENT?).

"The majority don't run at 640x480 low quality (I sure didn't spend $300 to run at crappy resolutions.) Nobody plays 3dMark for fun."

All you really can do, then, is enable all performance options on BOTH cards and then compare the results. Trying to find out WHY is nearly impossible, as I've found, if you don't know everything about each architecture.
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
I think some people in this thread are acting incredibly rude for no reason.

I agree with Dave that just by using DRIVER OPTIONS you can speed up the V5 by a lot.

When speaking of driver options... the default for the GTS is to have D3D in the 2x2 (low detail) FSAA setting.

Now, do you think reviewers should run D3D benchmarks at this DEFAULT setting?

That would make the V5 much faster than the GTS.
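For scale, here's the arithmetic behind why that default would skew things so badly; a throwaway sketch with an assumed resolution (the numbers are purely illustrative):

```c
/* Rough arithmetic behind the 2x2 FSAA point: 2x2 supersampling
 * renders the scene at twice the width and height, so the card fills
 * four times the pixels every frame. Resolution here is assumed. */
#include <stdio.h>

int main(void)
{
    int w = 1024, h = 768;
    long plain = (long)w * h;             /* pixels per frame, no FSAA      */
    long fsaa  = (long)(2 * w) * (2 * h); /* pixels with 2x2 supersampling  */

    printf("no FSAA: %ld pixels/frame, 2x2 FSAA: %ld (%ldx the fill work)\n",
           plain, fsaa, fsaa / plain);
    return 0;
}
```

Four times the fill work per frame is exactly why you never benchmark one card with FSAA forced on and the other without.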

The V5 just offers more driver settings than Nvidia.

He isn't talking about using an external program to "tweak" the V5, just using the in-driver settings that let you change things, like turning on anisotropic filtering on a GTS.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
Look. There are no NVIDIA driver settings that increase performance without hurting image quality. The only thing there is is adjusting the performance/quality slider. That is all. It isn't like we are trying to make 3dfx win the benchmarks. It wasn't that at all. We just saw, hey, they give us these options. Why not use them? Anyone who plays games would be using them. They increase performance and you can't see a lick of difference with them enabled. So use them. NVIDIA doesn't have those options, so we can't enable them there. That is unfortunate, but it isn't something that should be held against 3dfx. It simply does NOT make sense to not use the features. Now, if these were features that you had to go into the registry to use, or that you had to install third-party software to get at, I would COMPLETELY agree with you. That would be blatantly wrong. However, this is not the case. This is no more tweaking something than going in and enabling FSAA is. And that was my initial point. By the definition you are giving me, enabling FSAA is "tweaking," and it obviously isn't. It is simply enabling a feature.


Consider: if one card ships with FSAA enabled by default and another does not, do you benchmark them that way in performance comparisons? No, you don't. You go into the driver options and turn off FSAA on the one card, or turn it on with the other card. After that you compare performance.

Another way to look at it: you get two video cards and you are going to compare them. One disables v-sync by default and the other does not. Should you benchmark them both that way, or should you turn off v-sync on the other one? Well, obviously you should turn it off for a fair comparison. However, here you are adjusting options on the second card, which by your definition is tweaking, and so it should not be done.

Another comparison: say graphics card A forces 16-bit textures even when 32-bit is required (ATI did this; they still might, dunno). Is it right to compare its performance with card B, which has 32-bit textures enabled? No, of course not. Yet by your definition you should leave card A forcing 16-bit textures, because if you don't, you are tweaking the card.

I hope you see what I'm getting at. It is ok to adjust things for reviews.
It isn't ok to cheat to increase performance while screwing up your image.
However, if there is no visible difference in quality, even under a watchful
eye, there is nothing wrong with it. You are simply showing the best
possible performance each card gets. That is all. There is no crime in
that.

Somebody said that the purpose of a review is to know what you are getting when you buy a card. This is exactly it! The V5 offers settings that increase performance, and so we show this! Why? Because this is what the person is buying! NVIDIA doesn't offer any of that stuff, so we don't show it. Why? Because this is what the person is buying! That is fair: showing exactly what you are getting, showing the best each has to offer. Anything less would be unfair.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
I think everyone here is misinterpreting where I stand on the issue. I want everything to be the same when running benchmarks (or as close as is possible). If you tweak one card, tweak the other in the same way, or put up a note to say what you have done. It's ludicrous to suggest anything otherwise, and it's plain wrong to secretly tweak one card, leave the other one alone, and then publish misleading and deceptive benchmarks (as was done in this review).

Any arguments I have submitted about overclocking, beta drivers, different FSAA settings and such were only to illustrate the point I was trying to make about keeping everything the same, and how Dave was wrong not to do so. You can ignore those points when discussing acceptable benchmarking procedure.

kombatmud:

As far as I know, in order to overclock an nVidia card you need to alter a registry setting, and nVidia does not tell you how to do that. I would not consider that an "option offered by nVidia drivers".

Well, maybe not in nVidia's drivers, but then how many people do you know who buy nVidia's reference boards? It's the manufacturers that provide overclocking options, and I don't know a single manufacturer that doesn't ship the feature in their drivers.

If you like, I could probably round up several dozen screenshots of custom drivers made by companies which show the overclocking features present. And the drivers are usually based on the Detonators, so you don't lose any performance.
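For reference, the registry route kombatmud mentions does exist. The commonly reported tweak for Detonator-era drivers is the "CoolBits" key; here is a minimal C sketch, with the key path and value taken from community reports rather than anything official from NVIDIA:

```c
/* Minimal sketch of the "CoolBits" registry tweak. The key path and
 * DWORD value below are the commonly reported ones for Detonator-era
 * drivers -- treat them as community lore, not NVIDIA documentation.
 * Needs admin rights; build with a Win32 compiler. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    DWORD coolbits = 3; /* reported to unlock the clock-adjustment tab */

    if (RegCreateKeyExA(HKEY_LOCAL_MACHINE,
                        "SOFTWARE\\NVIDIA Corporation\\Global\\NVTweak",
                        0, NULL, 0, KEY_SET_VALUE, NULL,
                        &key, NULL) != ERROR_SUCCESS) {
        fprintf(stderr, "could not open/create the registry key\n");
        return 1;
    }
    RegSetValueExA(key, "CoolBits", 0, REG_DWORD,
                   (const BYTE *)&coolbits, sizeof(coolbits));
    RegCloseKey(key);
    puts("CoolBits set; reopen display properties to see the clock tab.");
    return 0;
}
```

Which is exactly the point: it's a registry hack, not a driver option.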

This point is moot though because I am not advocating overclocking, as I mentioned above.

If on the other hand, nVidia does not offer any options, and 3dfx does, then I think the 3dfx options should be used, and a statement should be made listing all the options used, so that a person looking to buy that board will be able to reproduce the same performance level.

Exactly. If any tweaks/changes are made they should be clearly listed in an obvious place in the review. Dave's article did not do that.

Soccerman:

BFG, setting the options that came with the manufacturer's drivers certainly isn't tweaking. I'll spare the analogies.

So, for example, if nVidia doesn't allow changing from a 32-bit Z-buffer and 3dfx does, you think it's totally acceptable to set the Voodoo to a 16-bit Z-buffer even though that is a huge advantage for 3dfx in the benchmarks? Based on the fact that you think you can't see a difference between the two?

I think not.
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
Well, maybe not in nVidia's drivers, but then how many people do you know who buy nVidia's reference boards? It's the manufacturers that provide overclocking options, and I don't know a single manufacturer that doesn't ship the feature in their drivers.

Given that there are very few manufacturer driver sets based on the 6.xx drivers... do you think that would be fair in benchmarking?
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
A couple of things:

First, let me check to make certain Kristof enabled those features. Thinking back on it, I'm not even completely certain that those features were in the driver yet. Not that it matters, I suspect.


And also, you talk about 3dfx setting the Z-buffer to 16-bit while others are at 24-bit? Where are you getting this? 3dfx is not doing this at all.
 

BigToque

Lifer
Oct 10, 1999
11,700
0
76


<< I want everything to be the same when running benchmarks (or as close as is possible) >>



Sorry, but that just cannot happen. These are two different cards that are optimized around two different APIs, and you cannot compare them with everything the same. In most hardware reviews, there is no mention that the V5's performance would be better running in its optimized API.

E.g.:
Run game 1 with GTS in D3D and get 100 FPS
Run game 1 with V5 in D3D and get 90 FPS

(Hmm, this is where reviewers usually stop. Look how the GTS mops the floor with the V5! What they don't show you is the other side of the coin.)

Run game 1 with GTS in Glide and get 90 FPS
Run game 1 with V5 in Glide and get 100 FPS

(We don't see that the V5 pulls off the same results in Glide as the GTS does in D3D, but that doesn't matter, 'cause 'the V5 sucks sh!t... long live nVidia!')

The GTS runs just as sh!tty in Glide as the V5 does in D3D (in this example; I don't have any real numbers here). Does that make the GTS a piece of sh!t? No. It only means that the GTS is pushing as much as it can in an API it wasn't optimized for, the same way the V5 is pushing as much as it can in an API it wasn't optimized for.

(I know you need a Glide wrapper to use Glide on cards not made by 3dfx.)
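For anyone wondering what a Glide wrapper actually is: it's a drop-in glide2x.dll that exports the entry points a game links against and forwards each call to the native API. A heavily simplified sketch of the idea (grDrawTriangle is a real Glide 2.x entry point, but the vertex struct here is cut down and the OpenGL mapping is illustrative; real wrappers handle far more state):

```c
/* Heavily simplified sketch of the Glide-wrapper idea: export the
 * Glide entry points a game links against and translate each call
 * into OpenGL. The real GrVertex carries z, w, fog and texture
 * fields as well -- treat this as illustration only. */
#include <windows.h>
#include <GL/gl.h>

typedef struct {
    float x, y;       /* screen-space position */
    float r, g, b, a; /* vertex color          */
} GrVertex;           /* simplified stand-in for Glide's GrVertex */

__declspec(dllexport) void grDrawTriangle(const GrVertex *va,
                                          const GrVertex *vb,
                                          const GrVertex *vc)
{
    /* Translate one Glide triangle into immediate-mode OpenGL. */
    glBegin(GL_TRIANGLES);
    glColor4f(va->r, va->g, va->b, va->a); glVertex2f(va->x, va->y);
    glColor4f(vb->r, vb->g, vb->b, vb->a); glVertex2f(vb->x, vb->y);
    glColor4f(vc->r, vc->g, vc->b, vc->a); glVertex2f(vc->x, vc->y);
    glEnd();
}
```

The translation overhead is part of why a wrapped card in Glide will never quite match a Voodoo running it natively.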
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Dave:

If one card ships with FSAA enabled by default and another does not, do you benchmark them that way in performance comparisons? No, you don't. You go into the driver options and turn off FSAA on the one card, or turn it on with the other card. After that you compare performance.

I'm not saying to leave the drivers at their defaults. I'm saying go into both drivers and change the same item on both boards. If one board doesn't have the same tweaks available, don't change the other board or put up a note about what you have changed. It's really that simple.

and you can't see a lick of difference with them enabled. So use them.

That is purely subjective and is not an accurate measurement. Once the cards are set the same you can make observations like that, because you are sure you are on even ground. As you said before, some cards might be doing something different, but you won't truly know unless you have them both set up exactly the same.

Because this is what the person is buying! That is fair: showing exactly what you are getting, showing the best each has to offer. Anything less would be unfair.

I understand that angle and there's nothing wrong with that as long as you clearly mention exactly what you had done. Why did you not mention you had done this in the review?

And also, you talk about 3dfx setting the Z-buffer to 16-bit while others are at 24-bit? Where are you getting this? 3dfx is not doing this at all.

Dave, I was just making that up as an example for Wingznut. You can ignore that one. I have seen visual artifacts with the 1.03 drivers at very long distances, but that's beside the point.

IBMer:

Given that there are very few manufacturer driver sets based on the 6.xx drivers... do you think that would be fair in benchmarking?

I beg to differ. Almost every mainstream nVidia card maker has updated to the 6.xx series by now. But as I said before, the overclocking issue is moot so just ignore it. It was purely for example purposes.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Stefan:

The same way the V5 is pushing as much as it can in an API it wasn't optimized for

You are talking about the difference between an industry-standard API (Direct3D) and a proprietary API (Glide) from 3dfx. If the V5 is struggling with an industry-standard API, I want to know about it. But I don't care if nVidia's boards have trouble running 3dfx's proprietary API, and frankly I don't think anyone else does either. They're two totally different cases.

Don't use both cards in Glide. Don't use one in Glide and the other in Direct3D. The tests should be done with both boards in Direct3D (or OpenGL).
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
I beg to differ. Almost every mainstream nVidia card maker has updated to the 6.xx series by now.

Actually Creative Labs is the only one.

They just posted their 6.34 drivers yesterday as a matter of fact. The drivers before that were 5.30s.

Nvidia hasn't released an SDK for the 6.xx drivers, so companies are having a hard time putting together their own drivers based on them. This is most likely because Nvidia doesn't want people to know the difference between the 6.xx and the 5.xx drivers.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0

I'm not saying to leave the drivers at their defaults. I'm saying go into both drivers and change the same item on both boards. If one board doesn't have the same tweaks available, don't change the other board or put up a note about what you have changed. It's really that simple.


But you could totally flip that situation. Because one card doesn't have certain options, you are holding back the card that does have the option. I mean, is it really fair to make one card look worse than it is because the other card doesn't have the features? No, of course not. You are setting up the card with the features to look worse than it is.


That is purely subjective and is not an accurate measurement. Once the cards are set the same you can make observations like that, because you are sure you are on even ground. As you said before, some cards might be doing something different, but you won't truly know unless you have them both set up exactly the same.


Visual artifacts are not a subjective thing. It is not an image quality issue. It is a case where they either exist or they don't. It is like texture maps: you either have them or you don't. It isn't a case where "I don't think it is really any different". It either is or isn't. It is like math: 2+2 always = 4. 2+2 doesn't equal 5 because I think it should.


I understand that angle and there's nothing wrong with that as long as you clearly mention exactly what you had done. Why did you not mention you had done this in the review?

I didn't write 97% of the thing; Kristof did. And that said, as I mentioned before, thinking back I'm not even 100% sure they were enabled. Looking at the numbers, I assume they were. However, I don't see why mentioning it would be required either. You are just using both cards at their max performance.


 

BigToque

Lifer
Oct 10, 1999
11,700
0
76
How is an open-sourced API that can be used by anyone considered proprietary? You also seem to be forgetting that many developers still use and will continue to use Glide as their API of choice.
 

BigToque

Lifer
Oct 10, 1999
11,700
0
76


<< But I don't care if nVidia's boards have trouble running 3dfx's proprietary API, and frankly I don't think anyone else does either. >>



Just because Quake 3 isn't hindered by running in OGL or D3D doesn't mean other games aren't; many are hindered severely. The best example I can think of off the top of my head is Ultima 9. That game runs like sh!t in D3D and great in Glide. Are you sure that every single person who bought U9 and has to run the game in D3D doesn't care that if they had bought the V5 over their GTS, their brand-new game wouldn't run like sh!t?
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
So BFG, if you want the cards to be as equal as possible... you are suggesting that T&L be disabled on the GeForce cards, because that isn't an option on the 3dfx cards.

(NOTE: This is just an example. Can you even "disable" T&L on a GeForce card?)


Look, personally I'd like to see exactly what the card's capable of when comparing. I said I'd spare the analogies, but I guess I lied.

When reading Road & Track, I don't see them crippling Car A because Car B has a smaller engine.
Nor do I see Consumer Reports not using the digital comb filter on the Sony TV because the RCA's is analog.
Hmmm... the HP printer only does 600 dpi output. Does that mean we shouldn't use the 1440 dpi output on the Epson when comparing?

Again, what the card does at default settings is irrelevant. People are going to "fine-tune" the options, so let's benchmark the hardware as people would use it in everyday gaming.
 

BurnedNIU

Junior Member
Nov 11, 2000
9
0
0
Look, i think there is a simple solution for this.

ANY changes you make in the driver settings you MUST tell the reader about, to allow the reader HIS/HER OWN OPINION on whether that setting change is worth it. Now, if you change settings and don't say so (which is very bad journalism, IMHO), the reader will think that the settings were more or less the same (defaults).
Simple, I think...
 

BurnedNIU

Junior Member
Nov 11, 2000
9
0
0
Wingznut:
"When reading Road & Track, I don't see them crippling Car A because Car B has a smaller engine.
Nor do I see Consumer Reports not using the digital comb filter on the Sony TV because the RCA's is analog.
Hmmm... the HP printer only does 600 dpi output. Does that mean we shouldn't use the 1440 dpi output on the Epson when comparing?"

Are you suggesting that Road & Track editors adjust the cars to their liking before they test horsepower? I highly doubt it. They take the cars as they are given, the way they are sold, the way MOST people drive them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
IBMer:

Actually Creative Labs is the only one.

I can tell you that MSI is using Detonator driver version 6.11 (6.18?). I could look for more if you really like, but I can't really be bothered. Anyway, as I told you, you can drop the argument, because I was never advocating overclocking in the reviews. It was just an example.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Dave:

Because one card doesn't have certain options, you are holding back the card that does have the option. I mean, is it really fair to make one card look worse than it is because the other card doesn't have the features? No, of course not. You are setting up the card with the features to look worse than it is.

I understand what you are saying. If you do make a change, inform the consumers. Take the example of T&L. Most hardware sites will say something like: "The GF2 MX has T&L, which helps it in low-resolution situations, while the Voodoo does not, and so does not score as well in CPU-limited situations".

That statement has told me:

(1)
T&L boosts certain things on the GF2 MX, so in situations similar to the benchmarks I can expect similar boosts. The Voodoo doesn't have T&L, so I won't get those boosts.

(2)
T&L was enabled on the GF2 MX during the test but not on the Voodoo, because the Voodoo doesn't support it. Hence T&L was at least partially responsible for the measured difference.

That single statement has made for a perfectly acceptable benchmark. If you want to do that for anything else, go right ahead, but remember to mention any changes you make. To me that is plain journalism etiquette.

It is a case where they either exist or they don't.

In the situations you have tested, sure. But have you played several hours on every level of Quake 3? How do you know Q3DM7 (for example) will not show artifacts with the settings you suggest, if you have never seen it do so?

Are you going to run back and update your article to say "yeah, the settings we gave you don't quite work on this level; we'll have to redo the benchmarks from scratch"? I think not.

And that said, as I mentioned before, thinking back I'm not even 100% sure they were enabled.

Oh, I think they were very much enabled. Unless 15 other independent hardware sites are wrong.

Stefan:

How is an open-sourced API that can be used by anyone considered proprietary?

Because it was built from scratch to work solely on the Voodoo series architecture. Whether or not it is open source is irrelevant. As I asked Wingznut PEZ the other day, what would you do if S3 open-sourced Metal? Would you ask 3dfx to implement it on their boards? Would you be happy benchmarking S3 boards versus Voodoos in Metal mode? I think not.

You also seem to be forgetting that many developers still use and will continue to use Glide as their API of choice.

That is false. There are legacy Glide games in use, but very few developers are developing for it anymore, if any at all. In fact 3dfx have considered the API dead and are not updating it anymore. Epic have officially dumped it in their Unreal 2 engine. Glide is no longer an issue unless you play a lot of legacy Glide games.

Wingznut PEZ:

So BFG, if you want the cards to be as equal as possible... you are suggesting that T&L be disabled on the GeForce cards, because that isn't an option on the 3dfx cards.

During the course of our discussions you have asked me that at least three times. Each time I have answered that testing hardware features of a video card is not the same as tweaking drivers for one and not the other, or using different APIs across the boards.

3dfx's FSAA is better than everybody else's, right? I have no trouble using that in benchmarks. Heck, if 3dfx had some accelerator which ran FSAA at three times the speed it does today, I would say "bring it on."

We are testing the hardware and its features. We are not testing how well a card does after certain things have been disabled on one card, which we hope will make no difference to the gameplay.

Hmmm... the HP printer only does 600 dpi output. Does that mean we shouldn't use the 1440 dpi output on the Epson when comparing?

When comparing print speed, yes; for that they should both be at 600 dpi. When comparing image quality, no. Those benchmarks done at B3D were FPS benchmarks, not image quality, so the tweaks were being done in the wrong place. They could have made a separate "let's see how far we can tweak each board and not lose image quality" piece. That would have been perfectly fine.

If you want to test image quality, by all means adjust each board for maximum image quality, independently of the others if necessary. Tweak each board to the maximum and tell me about it. Tell me how good 3dfx are with this thing. Tell me how good nVidia are with that thing. I want to know so I can make an informed decision.

BurnedNIU:

ANY changes you make in the driver settings you MUST tell the reader about, to allow the reader HIS/HER OWN OPINION on whether that setting change is worth it.

Exactly. Heck, it's your review. Change whatever the heck you like and whatever you think is appropriate. Just tell us about it!!!

the reader will think that the settings were more or less the same (defaults).

Well, more crucially, the reader will assume the settings are the same for both cards, as is standard practice in any benchmark test. It's irrelevant what the default settings are. They should be changed for benchmarking purposes to make them the same on all boards.

Everyone:
Alright I think it's time to wrap this up. I think everyone knows where everyone stands and we just end up going around in circles arguing the same points.

My final verdict on benchmarks/reviews is:
(1) Keep everything as close to the same as possible and avoid proprietary APIs.
(2) If you change anything for whatever reason and don't change the other cards, tell me about it and why you changed it.
(3) For image quality tests tweak each board for maximum image quality, and tell me about what you changed and how it helps.

Dave, I don't hold you personally responsible for that review, but I am a little disappointed in how it was carried out. I just wanted to express my opinion on the matter. I hope you didn't take my arguments personally or the wrong way, and I'd like to see you continue to post here, because you really do know your stuff.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
Instead of reading/answering your whole post, I'll just cut through to this: depth precision was not set to fastest, because depth precision was not even in that driver build. So this whole argument is mostly pointless.
 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
"I think everyone here is misinterpreting where I stand on the issue. I want everything to be the same when running benchmarks (or as close as is possible). If you tweak one card, tweak the other in the same way, or put up a note to say what you have done."

lol, read my last post.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Dave-

"They seem to be the ones with half a brain around here (and some others... just can't remember your name). If they want me to leave, I'll leave."

Gee, thanks. Only half a brain? Of course, I think you will find that none of us want you to leave.

Review/Benches-

Beyond3D is NOT AnandTech. It is NOT FiringSquad, it is NOT Sharky's or Tom's. The audience is different, the reviews are different, and they should be.

When I read a review there, I EXPECT the boards to be tweaked, even if it requires editing the registry or using other hacks, let alone a simple control panel adjustment. I don't care if they include a how-to; to me it seems an understood fact that if you are reading their reviews, you know enough to figure out the optimal setup on your own for your own rig (which very often is different depending on your particular system).

Nothing at all against any of the other sites; with them I do expect a list of exactly what they did, as they are geared towards broader audiences and are aiming the article at people who may not be all that familiar with tweaking things.

I don't want a blow-by-blow rehash of everything that has been covered by every other site; if I did, then I likely wouldn't read their reviews, which tend to come out a bit later than everyone else's. They set the boards up as best they can and then let them speak for themselves.

Their articles spend a great deal of time looking more in-depth than the others, breaking things down and looking at what the benchmarks are telling us instead of just throwing up the numbers and saying "See, card X beats card Y": things like looking at exactly where memory bandwidth starts to play a role, actual effective real-world fill rate versus just an FPS number (particularly handy with tiling/HSR about to become a bigger factor), and looking at the level of feature support and how it relates to gaming.
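As a rough sketch of that "effective fill rate" idea: take the measured frame rate, the resolution, and an assumed overdraw factor, and compare the result to the chip's rated peak. All numbers below are made up for illustration:

```c
/* Back-of-the-envelope "effective fill rate": the pixels a card
 * actually delivers per second at a given resolution and frame rate,
 * set against its theoretical peak. Every number here is illustrative;
 * the overdraw factor in particular has to be estimated per scene. */
#include <stdio.h>

int main(void)
{
    double width = 1024.0, height = 768.0; /* benchmark resolution       */
    double fps = 90.0;                     /* measured frame rate        */
    double overdraw = 3.0;                 /* assumed depth complexity   */
    double peak_mpix = 800.0;              /* hypothetical rated peak    */

    double effective_mpix = width * height * fps * overdraw / 1e6;
    printf("effective fill rate: %.0f Mpixels/s (%.0f%% of rated peak)\n",
           effective_mpix, 100.0 * effective_mpix / peak_mpix);
    return 0;
}
```

The gap between that effective number and the rated peak is exactly the kind of thing their reviews dig into, rather than just reporting the FPS figure.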

For the review in question, while I think it would be nice to have a link, I am assuming it was the big round-up done a while back. If that is the case, both the GF2 and the V5 were using older drivers: Det2 for the GF2 and a revision prior to 1.03(?) for the V5. Both boards have since seen fairly large performance boosts.

I think that Dave can vouch for the fact that I don't always agree with their conclusions (on occasion, "disagreeing" would be putting it mildly), but the methods with which they test their hardware are, IMHO, the best.

If you care to search through the forums there, you will find that they come onto the boards and ask the readers what benches they think should be used, and it often results in some good discussion showing off the relative strengths of the boards. This is a far different attitude from the Q3/UT/3DMark2K run-through that we see all too much of. Even when they do use the "standard" benches, they use them in a way that means something (3DMark2K is a very useful bench; just the 3DMark score is useless).

As I stated earlier, I may not always agree with their conclusions, but I don't think their testing methodology should ever be in question. They are the BEST in that particular area.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Alright, I am tired of all this useless talk. I want to see some numbers. Will somebody post benchmarks comparing a GeForce GTS, fully tweaked without any decrease in image quality, to a V5 5500, fully tweaked?
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
I'll be glad to run 'em.. just need to get myself out of bed...

I'll do them in a bit.
 

Dulanic

Diamond Member
Oct 27, 2000
9,951
570
136
Screw all the APIs! OpenGL, baby! It's just too bad more developers don't program with it, but it is a more difficult API to write for than, say, D3D or Glide.
 