9600GT SLi review


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: dreddfunk
Originally posted by: apoppin
You are late to the party Dredd ... Keys and BFG and a few others are already *doing* just that

Originally posted by: Azn
#1 has been tested already if you read a few pages back, and it has already been proven that you don't need 112 SPs to get near 8800gt performance.

The 9600gt is proof of #1 anyway.

#2 I can't test because I don't have an 8800gt or 9600gt. Yes, I would really appreciate if you would test it and shut BFG up once and for all.

'Late to the party', 'already been proven', hardly. All respect to Keys and BFG--as I think it was wonderful for them to do the tests to begin with--I don't believe their tests to be conclusive in any way, which is indicated by the fact that both plan further tests.

However, and this is the point of my posting, I think testing two G80 GPUs is the wrong choice. I'm interested in their results, but their applicability to G92/G94 cards, with their different texture and memory setups, remains tentative.

The issue at hand is why a *G94* card with 64 SPs performs so close to a *G92* card with 112 SPs. The answer to that question isn't going to lie in running tests on two G80 cards.

So I suggest coordinating a test between G92/G94 cards, while *controlling* for variables (such as number of shaders and texture units).

I will try to do what I can on my own, as a G92 owner, but someone with a G94 and a computer with similar specs would really need to be in the equation to *prove* anything (I'm on an e2140 @2.4, IP35-E, 3GB HP Value RAM DDR2-667 @600, BFG 8800GT OC @625, monitor up to 1920x1200).

If anyone knows of any downloadable time-demos that seem reasonable to test, I'd be more than happy to give it a go. I don't have many of the latest FPS-style games; I got out of the habit of playing them, but I'm willing to do what I can within reason (i.e. I don't have the money to go out and buy five games just to run benchmarks).

Cheers.

uh-uh - no ... i *only* stated that you were late and that others were already doing 'testing' ... don't lump me in with "Mr. Conclusive" or "Mr. Proven" ... there is no relation whatsoever


... and 'better later than never' [although to be sure it is "better never late"] ... i look forward to your perfected testing results



i know of quite a few *free* downloadable time demos ... a few require you to have the game

off the top of my head, these do not:

HL2's Lost Coast
Lost Planet
Call of Juarez


PM me if you want more
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
yes, me too. so we just need to cut an 8800gt to 64 SPs, run it on the newer drivers at the 9600gt clock speeds, and see if there's a difference vs. a stock 8800gt?
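(For reference, here's the back-of-the-envelope math behind that idea, as a quick Python sketch. The specs are the commonly quoted launch numbers and vary a bit by board partner; 3 FLOPs per SP per clock is the usual MAD+MUL assumption for G8x/G9x.)

```python
# Rough theoretical throughput comparison (assumed launch specs).
# Each G92 cluster pairs 16 SPs with 8 TMUs, so cutting an 8800gt
# down to 64 SPs also cuts it down to 32 TMUs.

def shader_gflops(sps, shader_mhz, flops_per_clock=3):
    return sps * flops_per_clock * shader_mhz / 1000.0

def texel_fillrate(tmus, core_mhz):
    # TMUs run at the core clock, not the shader clock.
    return tmus * core_mhz / 1000.0

cards = {
    "8800gt stock":              dict(sps=112, tmus=56, core=600, shader=1500),
    "8800gt cut, 9600gt clocks": dict(sps=64,  tmus=32, core=650, shader=1625),
    "9600gt stock":              dict(sps=64,  tmus=32, core=650, shader=1625),
}

for name, c in cards.items():
    print(f"{name:28s} {shader_gflops(c['sps'], c['shader']):5.0f} GFLOPS  "
          f"{texel_fillrate(c['tmus'], c['core']):4.1f} GTexel/s")
```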
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Why is it that Sickbeat was able to downclock his SP to 800mhz
See, I don't think he was. While you can clock the shaders separately there's a limit as to how far you can change them relative to the core clock, and if you go too far RivaTuner simply resets the clock to the last valid one.

That's yet another possible reason why he wasn't seeing a difference.

Try clocking your SP to 800mhz or so and do the bench again.
I can't. Anyway, I don't need to.

Based on my second test changing nothing but shader clocks (i.e. texturing was not affected) I got a 13% gain from 24% more shader power, or about a 1:2 ratio.

Using the 1:2 ratio on my first test we see that when I increased my SPs from 64 to 128 (100%) I got a 56% performance gain, so clearly 50% of that is from shaders while only 6% is from texturing.
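(The arithmetic behind that attribution, as a quick Python sketch; all the percentages are the figures from the two tests above, with the 1:2 ratio rounded as in the post.)

```python
# Test 2: only the shader clock changed, so texturing was untouched.
shader_increase = 0.24                    # +24% shader clock
observed_gain = 0.13                      # +13% frame rate
ratio = observed_gain / shader_increase   # ~0.54, rounded to 1:2 (i.e. 0.5)

# Test 1: SPs went 64 -> 128 (+100%), which also doubled the TMUs.
total_gain = 0.56                         # +56% frame rate observed
shader_share = 1.00 * 0.5                 # +100% shaders at the 1:2 ratio -> ~50%
texture_share = total_gain - shader_share # ~6% left over for texturing

print(f"ratio ~ {ratio:.2f}, shaders ~ {shader_share:.0%}, texturing ~ {texture_share:.0%}")
```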

When you disable 1/2 of the SPs on your ultra, your FP16 texture fillrate is cut to 1/2 as well
But I've since changed nothing but the shader clock.

Why don't you provide proof instead by downclocking your SP.
Err, downclocking the shaders is proof.

Comparing the 8800gt and 9600gt in Bioshock, the performance difference is there, but not the 50% reduction in frame rates you posted.
And? They're comparing two different cores, two different clock speeds, two different drivers, and likely different settings to mine.

But you didn't only demonstrate 128SP to 64SP. You cut your texture fillrate by 1/2 as well.
Which again has been demonstrated to hold true with simply downclocking the shaders.

What evidence did you bring?
What part of my results are you having difficulty reading and/or comprehending?

Let me know and I'll try to help you out.

You didn't even have a clue your texture fillrate was cut by half when you disabled your SP and tested your games on it.
Neither did you at first. In fact you were running around telling everyone you expected a zero performance gain until I provided results to the contrary and you flip-flopped on what you were saying before.

Anyway, as my shader clock test demonstrated, the loss of texturing played a minor part in the performance hit.

Kind of like 3dmark multi-texture test doesn't matter
It doesn't. Come on, show us developer and/or IHV commentary that states modern games will be bottlenecked in the manner your multi-texturing graph demonstrates. If you can't, you need to retract your claim.

I've demonstrated my side with reviewer and IHV comments that demonstrate shader bottlenecks.

and COD4 is primitive engine?
Yep. Again you don't seem to understand the basic notion of bottlenecks and why some test-beds are not useful to prove anything; you just keep running around and screaming your "raw texture fillrate is king" mantra.

Where was my comment about disabling 64SP having no performance impact?
Whenever someone posted they got no performance change.

I might have said little impact but I did not say NO impact.
Well you were wrong there too.

Is this why Geforce 7 is getting weak performance and x1000 does better on this game, and my budget 8600gts with 128bit bus does as well as a 7900gtx?
I thought in the other thread you didn't want to discuss old cards? So which is it Azn, do we talk about old cards or not?

Why don't you tell Activision that COD4 is primitive.
I don't need to, I can see from actually playing the game that it lacks many modern features such as FP HDR and that in places it looks more dated than the first two games.

Crysis isn't just shader intensive it is video card intensive.
Riiight, and I suppose Call of Duty 4 is perfect to test shaders? :roll:

Crysis is one of the most shader intensive games ever made, if not the most intensive. It ships with 85,000 of them for heaven's sake.
 

vanvock

Senior member
Jan 1, 2005
959
0
0
Originally posted by: bryanW1995
If the 675/2200 rumors are true, then it needs a lot of AA improvements to catch up to the gtx/ultra at high res. This card could be a big flop just because it's going to be priced so much higher than the 8800gt for minimal performance increase. Even rollo hasn't been bragging on it, he's been talking about the 9800gx2 instead. That should tell us that A.) the card is weak, or B.) rollo is trapped under something heavy and can't reach his keyboard. I'm going with A.


Thanks for the laugh! :laugh:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
*snip*

BFG all you do is reply with no evidence. You just reply and say yes you did. Pathetic.

While Sickbeat downclocked and even disabled 48 of his SP and saw no noticeable difference.

You can argue all you want by yourself. I'm sick of your baiting troll tactics.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: dreddfunk
jared - I think that's true in order to come up with a reasonably-founded guess about performance, but why stop there? Why not just use RivaTuner to scale back SPs on an 8800GT and see what we come up with? The truth is, I've no idea how different the G80 architecture is from G92 at the low level of how the SPs operate.

I kind of want to get to the bottom of whether the G94's SPs are much improved, or whether shaders just don't matter as much as I thought. I know specs won't be identical outside of a testing lab, but I feel that we could do a lot better than what we're doing right now, 'tis all.

I'll look for the Crysis demo. Does it have a built-in time demo?


[edited for spelling...or lack thereof]

Thanks Dredd. Please do test with your G92. Sickbeat already tested his 8800gt with COD4 by disabling 48 of his SPs and even downclocking his SP clocks. He saw no noticeable difference.

If you did some further tests with Bioshock I think we can shut that BFG troll with his 15000 garbage posts up once and for all.

There is a timedemo in bioshock. I don't know if there is a demo however.

Crysis has a timedemo as well since BFG thinks it's shader intensive.

Disable 48 of your SPs first and set it to 9600gt core and shader clock speeds. You will see that it will perform like a G94.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
BFG all you do is reply with no evidence. You just reply and say yes you did. Pathetic.
I already offered to explain my results to you but I haven't heard back.

While Sickbeat downclocked and even disabled 48 of his SP and saw no noticeable difference.
We've already been over this. Repeatedly.

Furthermore, if you truly think the results from SickBeast and Keys are accurate then you'd need to admit texturing performance makes no difference in that game, given they were lowering their texturing by disabling SPs.

Of course you've conveniently side-stepped that issue but continued to take issue with the shader side of things. These are the typical antics we have to endure from you every time this issue comes up.

You can argue all you want by yourself. I'm sick of your bating troll tactics.
Ditto.

Sickbeat already tested his 8800gt with COD4 by disabling 48 of his SP and even downclocked his SP clocks. He saw no noticeable difference.
Like I said, we've been over this before. Multiple times. The problem is your refusal to accept anything that contradicts your texturing "theories".

Crysis has a timedemo as well since BFG thinks it's shader intensive.
I could test Crysis and more games but why bother? You obviously don't like seeing anything that contradicts your views so I'd simply be wasting my time.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
I have noticed Azn self-contradicting (when comparing posts from other threads) as well.
Aye, ain't that the truth.

In the other thread now he's trying to tell me my AA settings use texture fillrate (LOL!) when ironically the fact I use AA helps his cause, because it strains memory bandwidth and ROPs, which lessens potential differences from low shader clocks.

Of course he doesn't see it that way.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Sickbeat saw no noticeable difference when he downclocked his SP to 800mhz or disabled 48 SPs. According to BFG COD4 is primitive. :laugh:

According to BFG the earth is flat too.

Maybe he'll dictate for us how his CPU scaling is revolutionary, but he couldn't figure out he had reduced his almighty FP16 texture fillrate by half when he disabled half of his SPs.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
I have noticed Azn self-contradicting (when comparing posts from other threads) as well.
Aye, ain't that the truth.

In the other thread now he's trying to tell me my AA settings use texture fillrate (LOL!) when ironically the fact I use AA helps his cause, because it strains memory bandwidth and ROPs, which lessens potential differences from low shader clocks.

Of course he doesn't see it that way.

You need all the support you can get.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Azn & BFG, you both need to calm it down.

BFG, You took that CoD4 experiment way too seriously. It was just a quick look-see. For CoD4, at 1680x1050, utilizing only 64 shaders in an 8800GT or 8800GTS 640 had no effect. This should tell you something more important than what resolution we tried. You completely missed the "little goal" of our little test. And you carry on this way as if you believe we are all a flawed species and stupid.

Not liking it.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Wow, such an interesting discussion from something as innocuous as 9600 SLI performance. Looks like I'm a bit late but.....I think Keys and Azn are onto something with the differences between G80 and G92 TMU comparisons. I've felt ROPs were the biggest determining factor of performance on 8800 parts for some time but I didn't realize the TMUs were directly tied to shader speeds/clusters as it seems some of these tests indicate. I was pretty convinced TMU performance was tied to core clock but it seems core clock speeds mainly impact ROP performance.

I think it's pretty clear though that NV's ingenious shader implementation is either more prolific than anyone thought or game devs/NV are holding back on shader performance. Tests here with 8800 parts with shaders disabled show performance tanking with only 32 shaders, and are confirmed by the horrible performance of the 32-shader 8600 parts. The 9600 and 8800 parts with 64 shaders all perform very close to one another though. When comparing parts with more than 64 shaders at similar clock speeds, performance is also very similar (G92 GT/S and even G80 GTS/GTX).

In any case, some of the observations here have confirmed what I've seen in testing my own G80/G92 parts: the G80 scales well on both the core and shader due to its 1:2 TMU ratio, extra ROPs and high bandwidth, while the G92 shows much smaller gains with core/shader increases because its TMUs are more efficient at 1:1 and bandwidth becomes the bottleneck with the 256-bit bus. Unfortunately, the way G80/G92 are designed, bandwidth and ROPs seem to be tied, so seeing improvement in one area without increasing the other is unlikely. The only "easy" way for NV to boost G92 performance on a 9800GTX would be to use GDDR4 to increase bandwidth, but it doesn't seem like they have any plans to do that either. The harder/transistor-expensive method would be to increase the bus size, which would increase the number of 64-bit memory controllers and ROPs, but it looks like that definitely will not happen until G100/GT200 on a completely new high-end die.
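(To put rough numbers on the bandwidth point, a quick Python sketch; the GDDR4 clock here is purely illustrative, not a leaked spec.)

```python
# Theoretical memory bandwidth = bus width in bytes * effective (DDR) clock.
def bandwidth_gbs(bus_bits, effective_mhz):
    return (bus_bits / 8) * effective_mhz / 1000.0

print(bandwidth_gbs(256, 1800))  # G92 8800gt: 256-bit GDDR3 @ 900MHz -> 57.6 GB/s
print(bandwidth_gbs(256, 2400))  # hypothetical GDDR4 @ 1200MHz, same bus -> 76.8 GB/s
print(bandwidth_gbs(384, 1800))  # wider 384-bit bus (more controllers AND ROPs) -> 86.4 GB/s
```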
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Back to an older discussion in this thread, I've owned two NV 6-series boards (P5N-E SLI and P6N Platinum) and have personally experienced or read about some of the problems with them. Many of them I've forgotten or have been fixed with BIOS updates or Vista hot fixes but there were enough problems with the chipset that I'd only recommend using one if you're confident you'll go SLI at some point. Reading AT's Vista SP1 Article actually reminded me of some of the bigger problems with NV chipsets that have been fixed over the last few months.

But here's a few that I can remember:

1) Compatibility problems with more than 2 DIMMs populated
2) Compatibility problems with more than 2GB installed (both during install and usage)
3) Inconsistent results with FSB speeds > 400MHz and official support of 1333MHz+ FSB (important for those considering a Penryn upgrade path)
4) NV USB controller problems in Vista (hot fixed)
5) X-Fi problems with 2GB+ (hot fixed and Creative patched)
6) X-Fi cracking/popping on older Nforce chipsets (prevalent on NF4, still seems to be a problem on some 6-series builds)
7) Poor SATA/IDE controller performance/compatibility
8) Board killing RAM (personally had 3 kits die on my P5N-E, seems to have been a bigger problem on the early 680i boards)

I'm sure there are a few more here and there that I've missed, but I've seen enough over the last 14 months or so to have recommended an Intel board to my brother for his latest build. If I had to do it over again I probably would've picked up a P35/X38 as well, although they weren't released until well after my initial build.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: bryanW1995
If the 675/2200 rumors are true, then it needs a lot of AA improvements to catch up to the gtx/ultra at high res. This card could be a big flop just because it's going to be priced so much higher than the 8800gt for minimal performance increase. Even rollo hasn't been bragging on it, he's been talking about the 9800gx2 instead. That should tell us that A.) the card is weak, or B.) rollo is trapped under something heavy and can't reach his keyboard. I'm going with A.



That's pretty funny!

I've only read the rumors you guys have read about the 9800GTXs, haven't seen anything official about them.

 

vanvock

Senior member
Jan 1, 2005
959
0
0
I've read some posts on other boards where oc'ers said upping the shaders gave little or no gain.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003
Azn & BFG, you both need to calm it down.

BFG, You took that CoD4 experiment way too seriously. It was just a quick look-see. For CoD4, at 1680x1050, utilizing only 64 shaders in an 8800GT or 8800GTS 640 had no effect. This should tell you something more important than what resolution we tried. You completely missed the "little goal" of our little test. And you carry on this way as if you believe we are all a flawed species and stupid.

Not liking it.

I'm done replying to him Keys. However I want to see some more tests. If I had an 8800gt I would so gladly do it.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: chizow
Wow, such an interesting discussion from something as innocuous as 9600 SLI performance. Looks like I'm a bit late but.....I think Keys and Azn are onto something with the differences between G80 and G92 TMU comparisons. I've felt ROPs were the biggest determining factor of performance on 8800 parts for some time but I didn't realize the TMUs were directly tied to shader speeds/clusters as it seems some of these tests indicate. I was pretty convinced TMU performance was tied to core clock but it seems core clock speeds mainly impact ROP performance.

It is tied to core clocks; however, the TMUs are tied to the SP clusters.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
BFG, You took that CoD4 experiment way too seriously.
I took it seriously because Azn believes he is right based on the results, but the results can't be used in such a fashion.

It's like me saying games aren't GPU limited and then running GLQuake at 320x240 to prove it.

It's great I ran the test but it doesn't really prove anything.

You completely missed the "little goal" of our little test.
What was the goal, may I ask? If it was to confirm what Azn was saying, then it failed.

Again I'm not having a go at you, I'm just explaining to Azn why he can't use that as evidence.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
I was pretty convinced TMU performance was tied to core clock.
It is because TMUs run at core clocks, not at shader clocks. However, TMUs are tied to SPs, so if some are disabled you also disable some TMUs.

That's why my second Bioshock test is so significant. There I only reduced the shader clock (so no effect on texturing performance) and was able to see similar performance scaling to when I disabled SPs (where texture performance was reduced).

That tells me texturing performance plays a minor role compared to shader performance, in Bioshock at least.
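(A quick Python sketch of why those two tests isolate different things; the cluster layout is the commonly cited G80 arrangement of 8 clusters with 16 SPs and 4 texture address units each, which is an assumption from public specs, and the clocks are the stock 8800 Ultra 612/1512MHz.)

```python
# TMUs sit in the same clusters as the SPs but run at the core clock,
# so disabling clusters cuts texturing while downclocking shaders does not.

def throughput(clusters, core_mhz, shader_mhz, sps_per=16, tmus_per=4):
    sps, tmus = clusters * sps_per, clusters * tmus_per
    gflops = sps * 3 * shader_mhz / 1000.0   # shader clock domain
    gtexels = tmus * core_mhz / 1000.0       # core clock domain
    return gflops, gtexels

print(throughput(8, 612, 1512))          # stock Ultra: all 8 clusters
print(throughput(4, 612, 1512))          # half the SPs disabled: texturing halved too
print(throughput(8, 612, 1512 * 0.76))   # shader clock cut ~24%: texturing untouched
```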
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: BFG10K
BFG, You took that CoD4 experiment way too seriously.
I took it seriously because Azn believes he is right based on the results, but the results can't be used in such a fashion.

It's like me saying games aren't GPU limited and then running GLQuake at 320x240 to prove it.

It's great I ran the test but it doesn't really prove anything.

You completely missed the "little goal" of our little test.
What was the goal, may I ask? If it was to confirm what Azn was saying, then it failed.

Again I'm not having a go at you, I'm just explaining to Azn why he can't use that as evidence.

The "little goal" was to establish whether or not the G94 had improved architecture or not over G92. It turns out the number of shader processors is irrelevant, if they aren't used. In CoD4, it is obvious that 112 shaders are not needed, and it was not that G94 had improved architecture at all. That's all there really was to it. We got what we needed, or at least what I needed out of it.

Moral: Just because a GPU has a billion shaders, doesn't mean they all get used 100% of the time. The 9600GT performance was misleading without the knowledge of the previous statement. Now that it is understood, 9600GT performance makes sense, and can be justified in our minds.

And yes, your mind is surely thinking of an obvious question. What about other more shader intensive games?

Of course there are games that will need more shader power than the 9600GT has. But again, this was not our goal. Our goal was simply to see whether the 9600GT was an improved architecture over G92, or just a cut-down G92. Nothing more, nothing less.

But then, as always, it morphed into something else I guess.

If you want to conduct tests on every popular game at various resolutions while disabling shaders, that would be interesting, and I would be willing to help out.
I have 2 8800GT's coming by Saturday and a 700W SLI certified PSU by tomorrow.
After seeing 9800GTX specs, there's no more point in waiting.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
SLi huh? .. i KNEW you would go for it, Keys!

awesome ... and congratulations

.. although i remain unconvinced that we have seen real GTX specifications
... and you are right, if you want to explore nvidia's multi GPU, there is no need to wait .. i am sure you got a very good deal

i'll 'call' you in May with r700 and up the ante, shall i?

 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: keysplayr2003
*snip*


I want to see some more tests. :beer:
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: apoppin
SLi huh? .. i KNEW you would go for it, Keys!

awesome ... and congratulations

.. although i remain unconvinced that we have seen real GTX specifications
... and you are right, if you want to explore nvidia's multi GPU, there is no need to wait .. i am sure you got a very good deal

i'll 'call' you in May with r700 and up the ante, shall i?

I am having a hard time buying that the 9800GTX is 1.5" longer just for an extra SLI connector, but alas, that's probably why.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
I bet your 400 bucks' worth of SLI 8800gt's will outpace $400 worth of a single 9800gtx. Can't wait to find out.
 