CrossFire and SLI frame rates do not reflect reality because of a lack of synchronization!


Blazer7

Golden Member
Jun 26, 2007
1,117
5
81
Originally posted by: BFG10K
Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out;
I surmised this was a possible solution a few pages ago but to be honest I can't see how it would work.

If the delayed frames are caused by the driver, the app can't speed them up; all it can do is slow down the faster frames and bring their durations more in line with the slower frames.

Of course doing that will significantly reduce the average framerate, and if there's any buffering involved it'll increase input lag, which is already a problem with AFR.

I partially agree with you. I think that it is possible to slow down the "faster" frames without reducing the overall frame rate, at least not by enough to make a difference. I suppose there will be some need for "extra" buffering, as the fast frames will be held in memory a while longer, and this will in turn increase input lag. But if the frames are displayed at more uniform intervals, that will still be better for the end user, as the visual experience will improve.

As apoppin mentioned, everything is a compromise. Losing a few frames in order to get rid of, or minimize, micro stutter beyond the point of perception sounds like a good compromise to me.

I really hope that nVidia and ATI do something about this.
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out;
I surmised this was a possible solution a few pages ago but to be honest I can't see how it would work.

If the delayed frames are caused by the driver, the app can't speed them up; all it can do is slow down the faster frames and bring their durations more in line with the slower frames.

Of course doing that will significantly reduce the average framerate, and if there's any buffering involved it'll increase input lag, which is already a problem with AFR.

The point is that the primary cause of the micro stutters is the fact that the CPU needs significantly less time to prepare a frame than the GPU needs to render it. The more evenly the load is shared between the CPU and GPU, the less noticeable the micro stuttering will be.

If the game is primarily GPU limited, the second GPU gets the data so fast that the point in time at which its frame is placed is just a few milliseconds behind the frame rendered by the first GPU, like this for example (the line representing the flow of time and each x representing a rendered frame):

x-x--------x-x--------x-x--------x-x--------x-x--------x-x--------x-x--------

If you delay the CPU for about half the time it takes the GPU to render the frame (a little less, actually), the frame rate won't be affected much, but the frames are spread out much more regularly (^ representing the added delay):

x------x------x------x------x------x------x------x------x------x------x------
....^^^

The difficult part is that this must be highly dynamic: usually the delay needs to be added for one frame only (because the following frames will then keep the rhythm), until the scene changes and the frames become uneven once again.
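Grestorn's delay scheme can be sketched as a tiny simulation (hypothetical timings and function names; this only illustrates the idea, not any actual driver algorithm):

```python
# Toy model of 2-way AFR (hypothetical numbers, not a real driver).
# Each GPU takes RENDER_MS to render a frame; the CPU finishes prep so
# quickly that GPU 2 starts only SKEW_MS after GPU 1.
RENDER_MS = 20
SKEW_MS = 2

def frame_times(n_frames, delay_ms=0):
    """Completion times (ms) of n_frames under this toy AFR model.

    delay_ms: extra delay the CPU inserts once, before feeding GPU 2 its
    first frame. Because the pipeline then keeps its cadence, every
    later GPU-2 frame inherits the same shift.
    """
    times = []
    for i in range(n_frames):
        pair = i // 2                    # which AFR pair the frame is in
        if i % 2 == 0:                   # GPU 1's frames
            start = pair * RENDER_MS
        else:                            # GPU 2's frames, shifted
            start = pair * RENDER_MS + SKEW_MS + delay_ms
        times.append(start + RENDER_MS)
    return times

def intervals(times):
    return [b - a for a, b in zip(times, times[1:])]

# Uncorrected: frames arrive in tight 2 ms / 18 ms pairs (micro stutter).
print(intervals(frame_times(8)))              # [2, 18, 2, 18, 2, 18, 2]

# One 8 ms delay (a bit less than half of RENDER_MS, as described above)
# makes every interval a uniform 10 ms.
print(intervals(frame_times(8, delay_ms=8)))  # [10, 10, 10, 10, 10, 10, 10]
```

In this toy model the total time for eight frames only grows from 82 ms to 90 ms, which is the "frame rate won't be affected much" part of the argument.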
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: nRollo
Originally posted by: BFG10K
Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out;
I surmised this was a possible solution a few pages ago but to be honest I can't see how it would work.

If the delayed frames are caused by the driver, the app can't speed them up; all it can do is slow down the faster frames and bring their durations more in line with the slower frames.

Of course doing that will significantly reduce the average framerate, and if there's any buffering involved it'll increase input lag, which is already a problem with AFR.

I'll take anything I've experienced with AFR over single card any day BFG, here's why:

50fps average at COH with Quad is more playable than 23fps average with a 8800U

60fps average at Oblivion with Quad is more playable than 20fps average with a 8800U

98fps average at HL EP2 with Quad is more playable than 43fps average with a 8800U

139fps average at COD4 with Quad is better than 44fps average with a 8800U

41fps average at Crysis with Quad is better than 19fps average with a 8800U

People buy multi GPU sets to have higher image quality BFG. Single cards aren't even playable at all in 3/5 benchmarks I linked to above.

You can talk about barely perceptible fluctuations caused by load balancing all you like- but the fact of the matter is multi card delivers a far better gaming experience.

What can anyone even say to this? That it's better to count the frames at 20fps than look at fluid animation at 60fps?


Edit:From the reviews I've seen of CrossfireX, this is the way of ATi hardware as well.

If the stutter bothers you, then it doesn't deliver a better experience, and the stutter is what we're discussing.

I haven't seen anyone in this thread suggest that Crossfire/SLI doesn't deliver higher framerates than single cards (and thus allow you to play with higher iq), but that's all you keep saying.

OP: You get stutter with Crossfire/SLI

Rollo: but it's faster!

*nobody argues with this, and the thread goes back to discussing the issue at hand*

Rollo: Is nobody listening to me? It's faster!

*nobody argues with this, and the thread goes back to discussing the issue at hand*

Rollo: but, but, it's faster!

We get it already.


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dug777
Originally posted by: nRollo
Originally posted by: BFG10K
Please keep us informed of your progress with this micro stutter issue - especially that program that "evens" it out;
I surmised this was a possible solution a few pages ago but to be honest I can't see how it would work.

If the delayed frames are caused by the driver, the app can't speed them up; all it can do is slow down the faster frames and bring their durations more in line with the slower frames.

Of course doing that will significantly reduce the average framerate, and if there's any buffering involved it'll increase input lag, which is already a problem with AFR.

I'll take anything I've experienced with AFR over single card any day BFG, here's why:

50fps average at COH with Quad is more playable than 23fps average with a 8800U

60fps average at Oblivion with Quad is more playable than 20fps average with a 8800U

98fps average at HL EP2 with Quad is more playable than 43fps average with a 8800U

139fps average at COD4 with Quad is better than 44fps average with a 8800U

41fps average at Crysis with Quad is better than 19fps average with a 8800U

People buy multi GPU sets to have higher image quality BFG. Single cards aren't even playable at all in 3/5 benchmarks I linked to above.

You can talk about barely perceptible fluctuations caused by load balancing all you like- but the fact of the matter is multi card delivers a far better gaming experience.

What can anyone even say to this? That it's better to count the frames at 20fps than look at fluid animation at 60fps?


Edit:From the reviews I've seen of CrossfireX, this is the way of ATi hardware as well.

If the stutter bothers you, then it doesn't deliver a better experience, and the stutter is what we're discussing.

I haven't seen anyone in this thread suggest that Crossfire/SLI doesn't deliver higher framerates than single cards (and thus allow you to play with higher iq), but that's all you keep saying.

OP: You get stutter with Crossfire/SLI

Rollo: but it's faster!

*nobody argues with this, and the thread goes back to discussing the issue at hand*

Rollo: Is nobody listening to me? It's faster!

*nobody argues with this, and the thread goes back to discussing the issue at hand*

Rollo: but, but, it's faster!

We get it already.

Sorry to post some benches that show what real stuttering is about in a thread about micro stuttering, Dug; it seemed applicable.

This thread is not representative of multi card gaming performance, a reader might think "Holy Cow! With that 1.4ms average variance in frame to frame refresh all multicard gain is lost! Oh noes- yet another example of corporations stealing our hard earned money!"

Which is a total crock.

I game with multicard solutions every day of the week at settings 99% of the world wish they had. Every now and again I can notice a hitch in framerates, but not 95% of the time.

Is it due to the AFR load balancing, my game setting exceeding my VRAM, something else?

Don't know, don't care. Wouldn't go back to looking at single card resolutions and settings at all- I'd move my ass over 10' and play my son's 360 on a 47" screen.

My neighbor was over a couple nights ago, I showed him UT3 at 25X16. (he's not a gamer) I played a whole round of vehicle CTF while he watched and at one point during my demo he said "This is just amazing how smooth this runs- I've seen people play computer games before and they were all jerky".

Apparently he didn't notice the "micro stutter"; my guess is he wasn't looking hard enough for it, and that's kind of the point: if you're gaming at settings unachievable with a single card, and the difference in animation isn't readily apparent, you win.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
it's a balance and a compromise

some of us HATE micro stutter just as some noise-princesses cannot stand a WHISPER out of their case

MOST of us "put up with" micro stutter because we cannot stand STUTTER and a slide show .. and we'd rather not turn down details so a single-GPU is not struggling. And if we can add some multi-GPU AA, then micro stutter goes away .. easily solved for me at 16x10 or 16x12



you guys that cannot stand micro stutter - too bad
.. i am so sorry for you guys
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I understand your point, and to be honest I'd have to play with a SLI/Crossfire rig over a range of games before I could provide any kind of meaningful feedback as to whether it would annoy me to the point where I was prepared to lose IQ and go back to a single card solution.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: dug777
I understand your point, and to be honest I'd have to play with a SLI/Crossfire rig over a range of games before I could provide any kind of meaningful feedback as to whether it would annoy me to the point where I was prepared to lose IQ and go back to a single card solution.

exactly .. and until you drop a few bills on a Multi-GPU setup - and are experienced with it and tweak it for a couple of months - you will not know how you will like it or not

evidently *some* of us love it [apoppin/Rollo/Keys] and some of us hate it [OP/BFG/nitro]

apparently far more of us love it than hate it



http://endeavorquest.net:8880/...mes/WIC_frametimes.htm

that doesn't look too bad^^
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Grestorn
The point is that the primary cause of the micro stutters is the fact that the CPU needs significantly less time to prepare a frame than the GPU needs to render it. The more evenly the load is shared between the CPU and GPU, the less noticeable the micro stuttering will be.

If the game is primarily GPU limited, the second GPU gets the data so fast that the point in time at which its frame is placed is just a few milliseconds behind the frame rendered by the first GPU, like this for example (the line representing the flow of time and each x representing a rendered frame):

x-x--------x-x--------x-x--------x-x--------x-x--------x-x--------x-x--------

If you delay the CPU for about half the time it takes the GPU to render the frame (a little less, actually), the frame rate won't be affected much, but the frames are spread out much more regularly (^ representing the added delay):

x------x------x------x------x------x------x------x------x------x------x------
....^^^

The difficult part is that this must be highly dynamic: usually the delay needs to be added for one frame only (because the following frames will then keep the rhythm), until the scene changes and the frames become uneven once again.

First of all, thank you for all of your hard work on nHancer. nHancer is the app that NVIDIA should have made themselves. :beer:

Now the questions...

- Is it always the second gpu that causes micro stutter? Basically, does the first gpu render at relatively consistent intervals, while the second one is erratic?

- If so, is micro stutter always a function of the second gpu rendering 'too soon', or does it vary?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
it would seem like it .. the first GPU is the standard .. the tic tic of the metronome if you will
- the 2nd one synchs with it or attempts to do so with AFR



right? [simplistic, yes]
 

daction

Senior member
Nov 18, 2000
388
0
0
I'm new to SLI and have noticed something like this micro stutter in Crysis, but I can definitely attest to Bioshock running amazingly well in SLI. With a single 9600GT at 1920x1200 w/ 2xAA I was getting 20-30fps in many parts, but when enabling the second 9600GT the action is as smooth as butter and my fps stays at 55-60, so it's basically a full 2x jump in smoothness.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
The point is that the primary cause of the micro stutters is the fact that the CPU needs significantly less time to prepare a frame than the GPU needs to render it. The more evenly the load is shared between the CPU and GPU, the less noticeable the micro stuttering will be.
The problem you describe equally applies to single GPUs (e.g. you can hit a GPU-limited scene where suddenly the CPU is ready much more quickly relative to the GPU than it was the previous frame). Single GPUs have a fluctuating framerate, but it's obviously not as bad as on AFR systems.

My theory is that micro-stutter is related to timing and synchronicity issues the driver has to deal with in order to keep multiple GPUs load balanced under AFR.

This is especially true if there are dependencies between GPUs, such as render-to-texture operations done on one board that are then expected to be re-used in subsequent frames, but which the other GPUs don't have. Those kinds of dependencies are not present on a single GPU.

If you delay the CPU for about half the time it takes the GPU to render the frame (a little less, actually), the frame rate won't be affected much, but the frames are spread out much more regularly (^ representing the added delay):
Your charts aren't accurate or to scale.

Using figures, if we have something like this (completion times for six frames):

Frame 1: 10 ms
Frame 2: 12 ms
Frame 3: 22 ms
Frame 4: 24 ms
Frame 5: 34 ms
Frame 6: 36 ms

To even that out we need 6 ms between each frame, which requires a 4 ms delay on every even-numbered frame:

Frame 1: 10 ms
Frame 2: 16 ms
Frame 3: 22 ms
Frame 4: 28 ms
Frame 5: 34 ms
Frame 6: 40 ms

The first example renders 6 frames in 36 ms while the second renders 6 frames in 40 ms, an 11% increase in total rendering time; not huge, but the question is whether IHVs will find that acceptable.

Of course depending on the size and frequency of variances and how many total frames are rendered, YMMV.
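For what it's worth, the arithmetic above checks out; a few lines of Python (just restating the numbers from the post) confirm it:

```python
# Check of the worked example above: timestamps in ms at which each
# frame completes, before and after padding the even-numbered frames.
original = [10, 12, 22, 24, 34, 36]
evened   = [10, 16, 22, 28, 34, 40]

# Frame-to-frame intervals: uneven 2/10 ms pairs become a uniform 6 ms.
print([b - a for a, b in zip(original, original[1:])])  # [2, 10, 2, 10, 2]
print([b - a for a, b in zip(evened, evened[1:])])      # [6, 6, 6, 6, 6]

# Cost: total time to show all six frames rises from 36 ms to 40 ms.
increase = (evened[-1] - original[-1]) / original[-1]
print(f"{increase:.0%} longer")  # 11% longer
```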

The difficult part is that this must be highly dynamic: usually the delay needs to be added for one frame only (because the following frames will then keep the rhythm), until the scene changes and the frames become uneven once again.
Delays like that could also wreak havoc with input response if the game tick is still updating data while external rendering delays, which the game doesn't know about, are being put in place.

Also, the more GPUs are used, the less pronounced the micro-stuttering effect is. I.e. in a situation where you get micro-stutters with 2-way SLI, there's a very good chance that the same scene works fine with Quad-SLI under Vista.
Why would more than two GPUs make a difference? If GPUs 1 and 2 are experiencing micro-stutter (e.g. 10 ms to render on GPU 1 and 2 ms to render on GPU 2), what bearing does GPU 3 or 4 have on the situation?

GPU 3 could render in 2 ms (which would have no extra adverse effect) or it could render in 10 ms and cause micro-stutter of its own relative to GPU 2.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Sorry to post some benches that show what real stuttering is about in a thread about micro stuttering, Dug; it seemed applicable.
No, they aren't applicable. We know AFR is faster than a single card. We know it can provide playable framerates over a single card. That isn't under debate. You don't have to keep providing benchmarks to prove something that isn't under question.

What is under debate is at what cost those extra frames come, and whether that cost is worth it over a single card.

This thread is not representative of multi card gaming performance, a reader might think "Holy Cow! With that 1.4ms average variance in frame to frame refresh all multicard gain is lost! Oh noes- yet another example of corporations stealing our hard earned money!"
If someone was handing me hardware on a silver platter I might well be more lenient towards potential issues.

But if I'm going to fork over hard-earned cash on these expensive setups, I deserve to know about potential problems, and others deserve the same. The evidence has been presented and they can make their own decisions.

I've experienced micro-stutter and it's one of the reasons I don't want to commit to AFR.

My neighbor was over a couple nights ago, I showed him UT3 at 25X16. (he's not a gamer) I played a whole round of vehicle CTF while he watched and at one point during my demo he said "This is just amazing how smooth this runs- I've seen people play computer games before and they were all jerky".
Wonderful, so you provide an example of someone who wasn't even playing the game and who appears to be a layman with regards to gaming? He didn't notice micro-stutter, so it's not an issue?

You have the 30" Dell, don't you?

http://www.behardware.com/arti...5/dell-3008wfp-hc.html

The newest version of that display has an input lag of about 3 frames. If you aren't bothered by that, then it's no wonder you aren't bothered by micro-stutter. That's fine (I'm personally not bothered by tearing without vsync, for example), but some people are bothered by these things and deserve to know about them.
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
The point is that the primary cause of the micro stutters is the fact that the CPU needs significantly less time to prepare a frame than the GPU needs to render it. The more evenly the load is shared between the CPU and GPU, the less noticeable the micro stuttering will be.
The problem you describe equally applies to single GPUs (e.g. you can hit a GPU-limited scene where suddenly the CPU is ready much more quickly relative to the GPU than it was the previous frame). Single GPUs have a fluctuating framerate, but it's obviously not as bad as on AFR systems.
No. Even though on a single-GPU system the setup of the first two frames might be too close together (because of the prerender), just like on a multi-GPU system, the single GPU has to render those two frames in sequence, one after the other, and it requires about the same amount of time to render each of them. That causes all following frames to be evened out automatically.

This doesn't happen with a 2-GPU SLI system, because the second frame will actually be rendered by the second GPU in parallel with the first, i.e. it will be finished with exactly the same time difference with which the two GPUs were fed data. So, if the scene doesn't change, this misalignment of the time differences between each odd and even frame will be sustained indefinitely.

Originally posted by: BFG10K
My theory is that micro-stutter is related to timing and synchronicity issues the driver has to deal with in order to keep multiple GPUs load balanced under AFR.
It's just not that complicated if you really think about it. It is as I explained it.

Originally posted by: BFG10K
Your charts aren't accurate or to scale.
They aren't meant to be to scale; they're just there to illustrate the problem.

Originally posted by: BFG10K
To even that out we need 6 ms between each frame, which requires a 4 ms delay on every even-numbered frame:
No, the additional delay only has to be added once. After that, the frames will automatically be placed correctly (if the scene doesn't change), because each GPU takes about the same time to render a frame.

And because of that, this algorithm wouldn't reduce the frame rate significantly.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
the second frame will actually be rendered by the second GPU in parallel with the first, i.e. it will be finished with exactly the same time difference with which the two GPUs were fed data. So, if the scene doesn't change, this misalignment of the time differences between each odd and even frame will be sustained indefinitely
that's all?
:Q

that really doesn't seem that hard to fix .. if you think about it - even using the prerender .. or not?
-several possible ways to minimize it, i would think



i dunno .. perhaps i haven't caught up on my sleep this week yet. I got more continuous hours of sleep last night - since midnight [to 6:30 am] - then i got in the sum of the previous days/nights since Sunday ..
--i hate when someone in my family gets really sick
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: apoppin
the second frame will actually be rendered by the second GPU in parallel with the first, i.e. it will be finished with exactly the same time difference with which the two GPUs were fed data. So, if the scene doesn't change, this misalignment of the time differences between each odd and even frame will be sustained indefinitely

that really doesn't seem that hard to fix .. if you think about it - even using the prerender .. or not?
-several possible ways to minimize it, i would think
well, yeah, it should be fixable, but the algorithm has to take into account that scenes are usually very dynamic. So it must not react to sudden changes in rendering time, but it must still be fast enough to prevent the stuttering if the scene remains stable for some time.

It's not really that trivial.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Grestorn
Originally posted by: apoppin
the second frame will actually be rendered by the second GPU in parallel with the first, i.e. it will be finished with exactly the same time difference with which the two GPUs were fed data. So, if the scene doesn't change, this misalignment of the time differences between each odd and even frame will be sustained indefinitely

that really doesn't seem that hard to fix .. if you think about it - even using the prerender .. or not?
-several possible ways to minimize it, i would think
well, yeah, it should be fixable, but the algorithm has to take into account that scenes are usually very dynamic. So it must not react to sudden changes in rendering time, but it must still be fast enough to prevent the stuttering if the scene remains stable for some time.

It's not really that trivial.

it sounds like a delay algorithm - dependent on the scene's dynamics . . . is it that hard to write?
[ i have verrry limited "programming" 'experience' with ... like "pong"; i realized long ago it is not what i am good at]

--back in the eighties





edit: MS [micro-stutter - we have an abbreviation!] is really not so evident on my HD2900 CrossFire when Xfire AA is enabled. How can alternate rendering like that be enabled - perhaps without actually enabling AA - and thus keeping the performance but losing the AFR MS?
- maybe i need sleep i am not making sense
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: apoppin
it sounds like a delay algorithm - dependent on the scene's dynamics . . . is it that hard to write?
"hard" is very relative...

First of all we have to convince ATI and nVidia that they HAVE to take action soon.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Oh that

why do we even need them?


Stop buying multi-GPU .. that is the best message
- their bottom line is clearest to their bean counters .. send a message and vote with your wallet

i bet AMD fixes it first

"NVIDIA announces the end of Multi-GPU MS" the next month
-- btw, i am dumping Crossfire this year

[and, i "specialize" in Manufacturer "politics" here and silly analyzes and predictions .. an occasional "serious" review.]

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
No, the additional delay only has to be added once. After that, the frames will automatically be placed correctly (if the scene doesn't change), because each GPU takes about the same time to render a frame.

And because of that, this algorithm wouldn't reduce the frame rate significantly.
The only time the scene won't change is when you're standing still, not touching the mouse to look anywhere, and nothing dynamic is happening in the background (e.g. no grenade is going off).

In other words this is a totally unrealistic situation to expect in regular gameplay; things will change, which'll cause the need for variable delays to be constantly added.

Like I said earlier, I have doubts this could offer a robust solution to micro-stuttering without negative impacts elsewhere.
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Originally posted by: BFG10K
The only time the scene won't change is when you're standing still, not touching the mouse to look anywhere, and nothing dynamic is happening in the background (e.g. no grenade is going off).

In other words this is a totally unrealistic situation to expect in regular gameplay; things will change, which'll cause the need for variable delays to be constantly added.

Yes, of course. I've already said that this algorithm has to be highly dynamic and that it's not so simple.

Anyway, the delay doesn't need to be added constantly, just every now and then.

If the scene is highly dynamic, the micro stuttering isn't much of an issue anyway, because in such scenes the framerates fluctuate, as do the delays caused by the CPU and GPU, and this conceals the problem completely. Only in slowly moving scenes will the user notice the stuttering, and those are exactly the scenes which can be fixed easily by adding a small delay every now and then.

I don't really understand your dismissive attitude toward this solution.


 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
I don't really understand your dismissive attitude toward this solution.
In my experience when you start adding external delays to a total rendering system (e.g. vsync) something else suffers and it's usually input response. AFR already has a problem in that department.

Now it could well be that you're right and it'll have very little impact elsewhere, but the devil is in the details.

In the application you referred to, do you know where in the pipeline it adds the delay?

Does it prevent the application from generating/sending frame data to the GPU until the delay has passed?

Does it allow the application to send frame data but prevent the GPU from working on it until the delay has passed?

Or does it simply buffer up finished frames and release them to the display only after the delay has passed?

Furthermore, has this application been confirmed to work in real games, or is it just a simulation of some kind? Can you provide more details about it in general?
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
The small tool I mentioned previously is not really a good solution, for various reasons. One of them is that its developer has disappeared completely and no one but him has the source code.

It just measured the frame times and dynamically evened out the frames by adding a statistical delay before the driver returned the "release" call.

It worked quite well for a hack, which is enough for a proof of concept. But the real solution has to be implemented in the driver itself.
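For readers wondering what such a hack might look like, here is a rough sketch of a frame-release smoother in the spirit Grestorn describes: measure release intervals, keep a slowly adapting average, and hold back frames that arrive far ahead of the rhythm. Everything here (class name, smoothing constant, thresholds, the simulated clock) is invented for illustration; the actual tool's internals are unknown.

```python
import time

class FramePacer:
    """Rough sketch of a frame-release smoother (invented, not the real
    tool): track a slowly adapting average of frame intervals and hold
    back any frame that would be released far ahead of that rhythm."""

    def __init__(self, smoothing=0.1, clock=time.perf_counter, sleep=time.sleep):
        self.smoothing = smoothing   # EMA weight for new intervals
        self.clock = clock           # injectable for testing/simulation
        self.sleep = sleep
        self.avg = None              # smoothed interval, seconds
        self.last = None             # time of the previous release

    def release(self):
        """Call once per finished frame, just before it is displayed."""
        now = self.clock()
        if self.last is not None:
            interval = now - self.last
            if self.avg is None:
                self.avg = interval
            else:
                early = self.avg - interval
                if early > 0.001:    # frame arrived >1 ms ahead of pace
                    self.sleep(early)
                    now = self.clock()
                    interval = now - self.last
                # Adapt slowly so scene changes don't whipsaw the pacing.
                self.avg += self.smoothing * (interval - self.avg)
        self.last = now

# Demo with a simulated clock: feed the uneven 18 ms / 2 ms arrival
# pattern discussed earlier and watch the short gaps get stretched.
class FakeClock:
    def __init__(self):
        self.t = 0.0
    def now(self):
        return self.t
    def sleep(self, seconds):
        self.t += seconds

clock = FakeClock()
pacer = FramePacer(smoothing=0.5, clock=clock.now, sleep=clock.sleep)
releases = []
for arrival_gap in [0.018, 0.002, 0.018, 0.002, 0.018, 0.002]:
    clock.t += arrival_gap           # next frame finishes rendering
    pacer.release()
    releases.append(clock.t)

gaps = [b - a for a, b in zip(releases, releases[1:])]
print([round(g * 1000, 1) for g in gaps])  # short 2 ms gaps widen toward the average
```

The real fix would sit in the driver, as Grestorn says; a user-space version like this can only delay the "release" it observes, which is exactly the statistical-delay approach the tool reportedly used.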
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Grestorn
The small tool I mentioned previously is not really a good solution, for various reasons. One of them is that its developer has disappeared completely and no one but him has the source code.

It just measured the frame times and dynamically evened out the frames by adding a statistical delay before the driver returned the "release" call.

It worked quite well for a hack, which is enough for a proof of concept. But the real solution has to be implemented in the driver itself.

i think you already made the most important point -

Anyway, the delay doesn't need to be added constantly, just every now and then.

Exactly

and is there any way to use Crossfire AA - not AFR - without the performance hit of actually 'enabling' AA?
 

Grestorn

Junior Member
Apr 23, 2008
13
0
0
Crossfire-AA doesn't have the micro stuttering problem because the cards render each frame sequentially, just like a single card. Both cards are just rendering the same frame with a small offset and the pictures are then blended together, resulting in the AA effect.

Crossfire-AA without AA is just like disabling CF completely.
 