R9 300 cards listed in new driver - R9 370 is a rebrand

Page 13 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
And that they get much, much lower than 40Hz. Right now, FreeSync monitors, if anything, require even better GPUs to keep the minimum above 40 FPS.

So if we have to wait a couple more years before getting into 30 or below, then nothing is lost in terms of features.
Just because the very first FreeSync-capable monitors have a 40Hz minimum refresh rate doesn't mean that we will have to "wait a couple more years before getting into 30 or below". Since FreeSync is based on the VESA industry-standard Adaptive-Sync, we'll definitely be seeing more and more FreeSync-compatible monitors hit the market as time goes on. And I'm sure some of them will have a 30Hz minimum refresh rate. But it definitely won't take "years" before we see one.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81

R9 390x is looking like a dual GPU now. Remember when AMD was talking about how dual GPUs will be beneficial to VR? 1 GPU per eye. I am not sure if I like this actually.


I am going to call this right now, and if I am wrong then I will stand corrected: the 390X will be full dual Tongas with HBM.

HD 7990 was 3% faster than a GTX 980; dual Tongas overclocked should net around 10-15% on top of that with HBM. The power consumption reduction will come from HBM and maybe a better revision of the same die.

I better be wrong, because this will be slower than Hawaii where Crossfire isn't enabled.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
R9 390x is looking like a dual GPU now. Remember when AMD was talking about how dual GPUs will be beneficial to VR? 1 GPU per eye. I am not sure if I like this actually.

Fudzilla ran an article where they connected the dots that R9 390X must be a dual-GPU card because AMD's slides showed that VR runs better with multiple GPUs. That article made 3 flawed assumptions:

1) It assumed AMD didn't make a VR presentation with a single card.
2) It assumed AMD didn't have 2x R9 390 cards in CF when making the presentation.
3) It assumed the maximum HBM1 VRAM size is 4GB per single GPU.

But let's stick to the dual-Tonga XT theory in the R9 390X:

First reason the theory doesn't hold up: there is far too much memory bandwidth when CF doesn't work, which results in highly inefficient use of expensive HBM memory. You have 640GB/sec of memory bandwidth via 1.25Ghz HBM, so call it 320GB/sec per GPU. However, Tonga has 40% higher memory bandwidth efficiency than Hawaii/Tahiti. That means its 176GB/sec is equivalent to 246GB/sec on a 7970/290X (!). Despite so much memory bandwidth, Tonga can't even outperform an R9 280X. Why would you give each such GPU 320GB/sec of bandwidth plus 40% memory bandwidth efficiency by using a very complex and expensive HBM setup just for the sake of lowering power usage? You would be wasting a ton of memory bandwidth. When CF doesn't work, you'll be wasting MORE than 50% of that 640GB/sec, since a 2048 SP Tonga with 32 ROPs and 128 TMUs is too slow to take advantage of it.
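For what it's worth, the arithmetic above checks out in a few lines (a quick sketch; the 640GB/sec total and the 40% efficiency factor are the post's own claims, not verified specs):

```python
# Sanity check of the bandwidth argument (all numbers are the post's claims).
total_hbm_bw = 640              # GB/s claimed for the hypothetical dual-GPU card
per_gpu_bw = total_hbm_bw / 2   # split evenly across the two GPUs
tonga_raw_bw = 176              # GB/s, R9 285's raw memory bandwidth
efficiency = 1.40               # claimed Tonga bandwidth-efficiency gain vs. Hawaii/Tahiti
tonga_effective_bw = tonga_raw_bw * efficiency  # Hawaii/Tahiti-equivalent bandwidth

print(per_gpu_bw)                       # 320.0
print(round(tonga_effective_bw, 1))     # 246.4 -> the "equivalent to 246GB/sec" figure
```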

What you'll end up with is a $650 card that loses to a 1-year-old R9 295X2. Since today's consumers choose a GTX980 or a Titan X over a $650 R9 295X2, this strategy would be an instant fail.

Second reason this theory doesn't make sense - perf/watt. AMD would be better off just taking the R9 290/290X and slapping 2 of those on a single card with an AIO CLC + HBM, because Hawaii has superior perf/watt to Tonga. So if AMD was severely strapped for time and cash, why start with an inferior mid-range chip?



If AMD were going to take full advantage of a dual-GPU setup + HBM, why even bother with Tonga? Since they already managed to design the R9 295X2 with 1Ghz 290Xs on there, they might as well add HBM to Hawaii, since the R9 290/290X has better perf/watt, while that 120mm AIO CLC cooler can easily cope with 500W of power usage. At least with this solution, when CF fails you get 92% of a 980's performance. But most importantly, AMD's new strategy is not to use mid-range performance chips in a dual-GPU solution (i.e., AMD themselves never made a 5850X2, 6850/6870X2, 7870X2, etc.).

Third reason this theory doesn't make much sense - Tonga's performance when CF doesn't work.



You see, the R9 290X is 59% faster than an R9 285. Even if the full Tonga XT is 10-15% faster, when CF ultimately doesn't work you'll end up with the R9 390X losing to the R9 290X by a whopping 30-40%. AMD would never release a next-gen GPU for $700 that would lose to a $280-300 R9 290X by 30-40% in modern games where CF isn't supported.

Fourth reason - a dual-GPU card using mid-range-sized chips vs. NV's flagship monolithic die has failed every generation. Why would AMD repeat the same unsuccessful strategy? 5970 vs. 580? Ya, that didn't work.

Fifth reason - the future. If you say the R9 390X is made of dual Tongas, then the best single chip remains Hawaii. Guess what that means? For the 14nm generation, AMD will be asked to make a chip at least 2.25X faster than the R9 290X in one generational jump: Titan X (1.5X) and Pascal (at least 1.5X over Titan X) = a 2.25X increase over the R9 290X. It makes more sense to design a much larger chip now to gain the experience necessary to make the transition to a large 14nm chip easier. AMD knows it has to increase die sizes sooner or later, so it might as well try it NOW on a very mature 28nm node that is more forgiving than a brand new 14nm node.

Sixth reason - dual-link HBM1 means we do not need 2 chips to have 8GB of HBM on 1 card. When Fudzilla published that article on 390X being made up of 2 chips, they were oblivious to the existence of dual-link HBM1.

-----

Now think about the die size:

The R9 290X has 37.5% more TMUs and SPs and 100% more ROPs than an HD7970, despite a die size increase of only 24.4% (438mm2 vs. 352mm2).



A 550mm2 28nm die would be a 25.6% increase in die size over Hawaii. That means AMD could squeeze in 37.5% more TMUs/SPs and again double the ROPs. However, hold that thought: remember how the R9 285 has higher pixel fill-rate performance with only 32 ROPs against the 64-ROP R9 290 (!)? That's right - Hawaii with 2X the ROPs loses.



That means, if AMD implements the pixel fill-rate/colour fill-rate efficiency improvements of Tonga into the R9 390X, it doesn't need to double the ROPs from 64 to 128. In fact, AMD would be fine with 64-96 ROPs. That means, at a 550mm2 die size, they would have more room to increase SPs/TMUs by more than 37.5%. So we could end up with a 4096 SP / 256 TMU / 64-96 ROP 390X at 1-1.05Ghz clocks and none of the CF scaling issues we would have with Tonga XTs.
 
Last edited:

DiogoDX

Senior member
Oct 11, 2012
746
277
136
I like how all the Nvidia guys now play games at sub-40 fps. I think they must go to consoles, because I can't play anything below a 60 fps average.

I only play BF4 without vsync, and it never goes under 40 fps.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
And that they get much, much lower than 40Hz. Right now, FreeSync monitors, if anything, require even better GPUs to keep the minimum above 40 FPS.

So if we have to wait a couple more years before getting into 30 or below, then nothing is lost in terms of features.

Awesome! Who do you think the first one to work at 15Hz-20Hz will be? Why shoot for 30Hz gaming? Let these hardware manufacturers get serious about the race to the bottom. Let's see who makes the first 240Hz monitor that we can game on @ <10% that speed, because that's the performance we all need. /sarc.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
A dual-GPU card makes no sense. The current 295X2 already beats the Titan X; how is making a dual card with 2 slower chips beneficial at all?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Except there are already working 30Hz-capable FreeSync monitors. Since FreeSync just launched, don't you think it's reasonable to wait 1.5-2 years to see the next wave of models? After all, it took 1.5 years just to get fewer than 10 G-Sync monitors on the market...

What FreeSync monitor goes down to 30Hz?


Yup, and AMD will re-brand 2011 GPUs in 2016 too. The 512GB/sec HBM with 1/2 DP compute for 2016 APUs is just a made-up PowerPoint presentation test used to hire summer interns.



I still remember how you kept insinuating that AMD can't improve performance/watt on 28nm and implying that they will just re-brand 290X and be stuck with that level of performance until 2016 since only NV can improve perf/watt and make architectural advancements. Will make sure to revisit this point when R9 390 series launches.

As I already said, it's not a question of IF R9 390X will smoke your 980 but a question of by how much.

Still going on about my 980? I don't think anyone but you cares. What does that tell you?

Who knows if the slide is real or not. And even if so, if a product will launch.

It's hard to argue against AMD's own drivers. They pretty much only left the 380 and 390 up for discussion. The entire mobile 300 series is a rebrand. And the desktop is, from 310 to 370.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Just because the very first FreeSync-capable monitors have a 40Hz minimum refresh rate doesn't mean that we will have to "wait a couple more years before getting into 30 or below". Since FreeSync is based on the VESA industry-standard Adaptive-Sync, we'll definitely be seeing more and more FreeSync-compatible monitors hit the market as time goes on. And I'm sure some of them will have a 30Hz minimum refresh rate. But it definitely won't take "years" before we see one.

The problem is it's a niche product. Only AMD supports it. And even if Intel commits, it will first be with Cannonlake at the end of 2016.

And since the current monitors seem unable to go that far down, I don't think any change is right around the corner. It may need a new generation of scalers.

People buying this feature are most likely people with high-end GPUs to begin with. So it's not like there is a huge demand for lower. G-Sync suffers from the same.

Awesome! Who do you think the first one to work at 15Hz-20Hz will be? Why shoot for 30Hz gaming? Let these hardware manufacturers get serious about the race to the bottom. Let's see who makes the first 240Hz monitor that we can game on @ <10% that speed, because that's the performance we all need. /sarc.

You confirmed what I said. There is little to no incentive to go lower.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
R9 390x is looking like a dual GPU now. Remember when AMD was talking about how dual GPUs will be beneficial to VR? 1 GPU per eye. I am not sure if I like this actually.


I am going to call this right now, and if I am wrong then I will stand corrected: the 390X will be full dual Tongas with HBM.

HD 7990 was 3% faster than a GTX 980; dual Tongas overclocked should net around 10-15% on top of that with HBM. The power consumption reduction will come from HBM and maybe a better revision of the same die.

I better be wrong, because this will be slower than Hawaii where Crossfire isn't enabled.

Is there any confirmation of HBM at all? Otherwise the wildcard could be GDDR5. Besides power savings, HBM doesn't really offer anything GDDR5 can't do. We have to wait till HBM2 for the real breakthrough.

The 7990 was an ultra-niche product due to its 500W power draw. A dual Tonga using 300W would fit the bill for a card that could be much more widely used and accepted.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Besides power savings, HBM doesn't really offer anything GDDR5 can't do.

Let us know how we can get 512GB-640GB/sec memory bandwidth out of GDDR5 this generation. No GPU maker in the world has so far made a 512-bit memory controller work with 8Ghz GDDR5.
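As a rough illustration of the comparison being made here: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch (the 8Gbps GDDR5 configuration is the hypothetical one from the post; the HBM1 figures are the ones discussed in this thread):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# Hypothetical 512-bit GDDR5 at 8Gbps (no shipping GPU paired these at the time):
gddr5_bw = peak_bandwidth_gbs(512, 8.0)        # 512.0 GB/s
# Four HBM1 stacks, each with a 1024-bit interface at 1.25Gbps per pin:
hbm1_bw = 4 * peak_bandwidth_gbs(1024, 1.25)   # 640.0 GB/s

print(gddr5_bw, hbm1_bw)  # 512.0 640.0
```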

What FreeSync monitor goes down to 30Hz?

CES 2015 had some. Everyone knows that FreeSync has a range of 9-240Hz, which means the 40-48Hz limitation on some monitors today is directly related to the panel chosen / manufacturer of that model.

Still going on about my 980? I don't think anyone but you cares.

It's not about your 980, but the fact that for the last 6 months you have continuously implied that R9 300 series are all re-brands with some models getting "worthless" HBM. You have continuously implied that AMD can't improve perf/mm2 and perf/watt on 28nm node without a brand new architecture. Essentially all your posts were of the view that 980 will beat anything AMD has in perf/watt until 14nm node.

Even now you keep stating that HBM1 offers nothing worthwhile other than lower power usage. After being so horribly wrong with your predictions in the past, you would be wise to do extensive research before posting theories on future products.

When R9 390X thrashes a 980 in perf/watt and performance, we'll have to add 0/1 to your growing list of 0/XX predictions you got correct.

You confirmed what I said. There is little to no incentive to go lower.

Are you going on record that no FreeSync monitor released in the next 2-3 years will have refresh rates below 40Hz because the tech is flawed / there is no incentive?

Entire mobile 300 series is a rebrand.

So basically not a 1MHz bump in clocks, no new features. I will write this down.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,842
5,457
136
A dual-GPU card makes no sense. The current 295X2 already beats the Titan X; how is making a dual card with 2 slower chips beneficial at all?

Power consumption of the 295X2 is 500W. I imagine the 390X is going to be 275-300 W. It's possible they have made further improvements to CrossfireX to minimize the downside of two chips. So basically the 390X would be 2X "285X" while the 390 would be 2X 285.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
You confirmed what I said. There is little to no incentive to go lower.

Okay, I know how biased you are against AMD, but really? Really? That's a ridiculously bold claim unless you're saying that the tech is DoA and G-Sync will kill it off prematurely...
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
R9 390x is looking like a dual GPU now. Remember when AMD was talking about how dual GPUs will be beneficial to VR? 1 GPU per eye. I am not sure if I like this actually.


I am going to call this right now, and if I am wrong then I will stand corrected: the 390X will be full dual Tongas with HBM.

HD 7990 was 3% faster than a GTX 980; dual Tongas overclocked should net around 10-15% on top of that with HBM. The power consumption reduction will come from HBM and maybe a better revision of the same die.

I better be wrong, because this will be slower than Hawaii where Crossfire isn't enabled.



This is one of the most baseless theories I've ever read...
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
It's not about your 980, but the fact that for the last 6 months you have continuously implied that R9 300 series are all re-brands with some models getting "worthless" HBM. You have continuously implied that AMD can't improve perf/mm2 and perf/watt on 28nm node without a brand new architecture. Essentially all your posts were of the view that 980 will beat anything AMD has in perf/watt until 14nm node.

Even now you keep stating that HBM1 offers nothing worthwhile other than lower power usage. After being so horribly wrong with your predictions in the past, you would be wise to do extensive research before posting theories on future products.

When R9 390X thrashes a 980 in perf/watt and performance, we'll have to add 0/1 to your growing list of 0/XX predictions you got correct.

It's not about your 980? Then why do you bring it up?

Countless times... It is clear to many people that you have an issue with the 980 and/or the people who own them. You came at me attacking my 980... my entire system, actually. I mean no disrespect, but it is very clear that you have a 980 complex or something, clear to anyone but you.

You think people will be crying and upset if AMD finally launches a strong single-card flagship that dethrones their 980; you might even dream about it. But that is really funny to me. Because I promise you, I would be extremely happy to see AMD come out with something great. The GPU market needs a strong AMD. It's not gonna bother us (most of us) one bit if AMD tops the 980 nine months after it launched.

I am not only looking forward to it, I would be very sad if they didn't.
 
Last edited:

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
It's not about your 980? Then why do you bring it up?

Countless times... It is clear to many people that you have an issue with the 980 and/or the people who own them. You came at me attacking my 980... my entire system, actually. I mean no disrespect, but it is very clear that you have a 980 complex or something, clear to anyone but you.

You think people will be crying and upset if AMD finally launches a strong single-card flagship that dethrones their 980; you might even dream about it. But that is really funny to me. Because I promise you, I would be extremely happy to see AMD come out with something great. The GPU market needs a strong AMD. It's not gonna bother us (most of us) one bit if AMD tops the 980 nine months after it launched.

I am not only looking forward to it, I would be very sad if they didn't.

Really, guy? If RS mentions the 980 (or any card, really), it is in the context of perf/$; at least his post history shows this. So it isn't some kind of "internet" grudge...
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
R9 390x is looking like a dual GPU now. Remember when AMD was talking about how dual GPUs will be beneficial to VR? 1 GPU per eye. I am not sure if I like this actually.

I am going to call this right now, and if I am wrong then I will stand corrected: the 390X will be full dual Tongas with HBM.

HD 7990 was 3% faster than a GTX 980; dual Tongas overclocked should net around 10-15% on top of that with HBM. The power consumption reduction will come from HBM and maybe a better revision of the same die.

I better be wrong, because this will be slower than Hawaii where Crossfire isn't enabled.

There is no way that AMD is going to release a flagship card that's actually slower than the corresponding flagship it's replacing (R9 295X2). I can't think of any instance of this happening in the history of GPUs, ever. No one would buy it, and it would be laughed off the stage at all the review sites.

The only possible way that the R9 390X could be a dual-GPU card would be if AMD figured out some hardware method to get two or more chips to transparently appear to the system as a single powerful GPU, working correctly in all applications with no special software profiles needed. That seems unlikely.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Really, guy? If RS mentions the 980 (or any card, really), it is in the context of perf/$; at least his post history shows this. So it isn't some kind of "internet" grudge...

That's exactly it. How often do they remember me recommending the HD6970 over an unlocked 6950? What about the HD5770 over the faster HD4890, despite the former having superior perf/watt? What about the R9 290X over an R9 290 until the price of the 290X came down to within $40-50 of the 290?

See, from where I am sitting, other than HDMI 2.0 and H.264 hardware decoding (niche features at the moment, imo), when all is said and done the 980 is just 11% faster at 1440P and just 6% faster at 4K against a November 2013 R9 290X. I don't know how the media spun it, but they made the 980 seem like some revolutionary card when it absolutely isn't for people who already purchased a 290/290X/780Ti 10 months before. It's only revolutionary if you are an engineer looking at it from a perf/watt perspective. For a gamer, the 980 at $515-550 is a total flop.

The 970 has great price/performance in the NV stack, but the 980 is grossly overpriced. When was the last time the fastest NV card offered so little performance increase over the 2nd-tier card at such a huge price premium? I can't remember. The cheapest version on Newegg is $515 vs. $280 for an R9 290X. Seriously: 84% more expensive, the same VRAM amount, and barely more performance on average at high-rez gaming unless we are talking about GW titles. If the 980 were an AMD card, it would have been crucified as the biggest rip-off of all time. After all, the 7970 was, but that card smoked the 580 by 40-80% when both were overclocked. The 980 manages no such feat against a 290X OC or 970 OC, yet commands this astounding price premium.

Why do you think so many people online are 'desperate' for AMD's R9 300 series? Because between a $1K Titan X and the excellent $320 970, the extra premium the 980 commands over the 970/290X is like throwing money into the toilet, unless you find a poor, uninformed soul who will buy your 980 for most of its value weeks before the 390/390X/GM200 6GB drops (feel sorry for that guy/gal). Look, it's not my money, and some people earn $1,000 a day, but since we aren't going to bring people's earnings into the equation, we have to look at each card objectively as we did in the past. By that metric, the 980 is a total failure for the price. At least when you max-overclocked a 480 or a 580, they really did start to pull away from the 5870/6970.

Given the amount of glorified hype the 980 has received in the last 6 months vs. the pessimism/negativity and "re-branding" desperation directed at the R9 300 series, one almost gets the feeling that some people want the R9 300 series to fail hard. While the 970/980 are selling like hot cakes, mostly for perf/watt, marketing reasons and the permanently damaged/tarnished reputation of the R9 290 series, outside the $1K Titan X the performance level for PC gamers has NOT moved since November 2013. For that reason, the GPU market desperately requires an adjustment in the performance curve -- hopefully delivered by the R9 390, 390X and various GM200 6GB SKUs.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The 970 has great price/performance in the NV stack, but the 980 is grossly overpriced. When was the last time the fastest NV card offered so little performance increase over the 2nd-tier card at such a huge price premium? I can't remember. The cheapest version on Newegg is $515 vs. $280 for an R9 290X. Seriously: 84% more expensive, the same VRAM amount, and barely more performance on average at high-rez gaming unless we are talking about GW titles. If the 980 were an AMD card, it would have been crucified as the biggest rip-off of all time. After all, the 7970 was, but that card smoked the 580 by 40-80% when both were overclocked. The 980 manages no such feat against a 290X OC or 970 OC, yet commands this astounding price premium.

Why do you think so many people online are 'desperate' for AMD's R9 300 series? Because between a $1K Titan X and the excellent $320 970, the extra premium the 980 commands over the 970/290X is like throwing money into the toilet, unless you find a poor, uninformed soul who will buy your 980 for most of its value weeks before the 390/390X/GM200 6GB drops (feel sorry for that guy/gal). Look, it's not my money, and some people earn $1,000 a day, but since we aren't going to bring people's earnings into the equation, we have to look at each card objectively as we did in the past. By that metric, the 980 is a total failure for the price. At least when you max-overclocked a 480 or a 580, they really did start to pull away from the 5870/6970.

Given the amount of glorified hype the 980 has received in the last 6 months vs. the pessimism/negativity and "re-branding" desperation directed at the R9 300 series, one almost gets the feeling that some people want the R9 300 series to fail hard. While the 970/980 are selling like hot cakes, mostly for perf/watt, marketing reasons and the permanently damaged/tarnished reputation of the R9 290 series, outside the $1K Titan X the performance level for PC gamers has NOT moved since November 2013. For that reason, the GPU market desperately requires an adjustment in the performance curve -- hopefully delivered by the R9 390, 390X and various GM200 6GB SKUs.

The 980 vs. 970 situation is very similar to the 580 vs. 570 ($499 vs. $349).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The 980 vs. 970 situation is very similar to the 580 vs. 570 ($499 vs. $349).

No.

GTX580 was 15-16% faster for a 43% price premium ($500/$350)
GTX980 is 15-16% faster for a 67% price premium ($550/$330)

In other words, relative to the 570/580 generation, NV is charging a 56% higher price premium (67% vs. 43%) for essentially the same 15-16% performance gap between its 2nd-tier and top-tier cards when moving from a 970 to a 980.
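A quick sketch of that premium-per-performance comparison, using the street prices quoted above:

```python
# Price premiums of the top-tier card over the 2nd-tier card, per the post.
premium_580_over_570 = 500 / 350 - 1   # ~0.43 (43%)
premium_980_over_970 = 550 / 330 - 1   # ~0.67 (67%)

# Both flagships were ~15-16% faster than their siblings, so the premium
# paid per point of extra performance grew by roughly:
premium_growth = premium_980_over_970 / premium_580_over_570 - 1

print(round(premium_580_over_570 * 100))  # 43
print(round(premium_980_over_970 * 100))  # 67
print(round(premium_growth * 100))        # 56
```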

Here is more.

Per Sweclockers today,

In modern games today, GTX980 only beats 780Ti by about 11% at 1440P and 12% at 4K.
In modern games back then, GTX680 beat GTX580 by 42% at 1080P and 45% at 1440P (using 7870 = 580)

Don't like that site? Modern gaming review



At 2560x1600, GTX680 is 51% faster than a 580 (NV's last gen flagship), but 980 is only 11% faster than a 780Ti (NV's last gen flagship).

Even if we only look at 680 vs. 580 and 980 vs. 780Ti, the 980 is an overpriced mid-range chip that has perf/watt marketing fluff going for it. Put that perf/watt marketing aside and consider the context: the 980 easily cements itself as the least impressive $500-550 NV GPU ever made, the least impressive generational leap at the $500-550 level from NV ever. That's why GM200 6GB and R9 390/390X are absolutely necessary to correct this stagnation and overpricing.

IMO, the reason the 980 sold so well is not how good the card really is, but that it was the fastest single GPU for a long time, and by default people who build new rigs or upgrade often get the fastest single GPU. Marketing- and sales-wise, the 980 is a wild success as a result of AMD not showing up on time. However, taking into account the 980's price and the time elapsed since the 290X/780Ti launches, as a generational leap the 980 is an utter disappointment. Never in the history of AMD/NV/ATI has a next-gen card priced at $550 been this unimpressive vs. the cards preceding it.

Once we consider 4K gaming benchmarks over 780Ti/290X, 980 is an embarrassment for a next gen $550 card. If R9 300 series flops, this desktop GPU generation will go down in history as one of the worst of all time, if not the worst. Hopefully GM200 6GB delivers if R9 390 flops.

As a side note, the 960 has already cemented its place in history as the worst x60 successor from NV ever.

Only 11% separates an after-market 960 and an after-market 760. At no point in NV's history has a next-gen x60 card been only 11% faster than its predecessor.
http://www.computerbase.de/2015-01/nvidia-geforce-gtx-960-im-test/12/
 
Last edited:

lyssword

Diamond Member
Dec 15, 2005
5,761
25
91
Really, the 980 is not too bad, considering it didn't get a node shrink the way the 680 did vs. the 580. The problem, like you say, is the price.
 