[Sweclockers] Radeon 380X coming late spring, almost 50% improvement over 290X

Page 4 - AnandTech Forums

DeathReborn

Platinum Member
Oct 11, 2005
2,757
753
136
4GB is not going to be enough for 4k going forward though. I can see a 380X being 4GB for the time being but I do not see GM200 being stuck at 4GB of memory and if AMD do have a 390X to take it on it is going to need 6-8GB to compete at 4k.

I think they can only make 4GB max for 1st gen HBM and not 2GHz either. 8GB will have to wait for the next round of cards after 3x0 at the earliest.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
4GB is not going to be enough for 4k going forward though. I can see a 380X being 4GB for the time being but I do not see GM200 being stuck at 4GB of memory and if AMD do have a 390X to take it on it is going to need 6-8GB to compete at 4k.

Begging the question. Currently, you want to have enough VRAM to hold all of the data a GPU needs to avoid memory swaps, which greatly decrease performance. Given that HBM is an order of magnitude faster than GDDR5 and delta color compression is being used, are both technologies fast enough and efficient enough to not require an increase in the amount of VRAM?
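For rough context on that bandwidth claim, peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch, using illustrative figures (a 290X-class GDDR5 setup and a hypothetical 4-stack first-gen HBM configuration), not confirmed specs for any 300-series card:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 290X-class GDDR5: 512-bit bus, 5 Gbps per pin
print(bandwidth_gbs(512, 5.0))   # 320.0 GB/s

# Hypothetical first-gen HBM, 4 stacks: 4096-bit bus, 1 Gbps per pin
print(bandwidth_gbs(4096, 1.0))  # 512.0 GB/s
```

By these illustrative figures the jump is large but closer to 2x than a literal order of magnitude; the point about avoiding forced memory swaps still stands.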
 

Timorous

Golden Member
Oct 27, 2008
1,727
3,152
136
I think they can only make 4GB max for 1st gen HBM and not 2GHz either. 8GB will have to wait for the next round of cards after 3x0 at the earliest.

They would be better off with GDDR5 with the colour compression used in Tonga (or a further improved version). 4GB will not cut it unless they intend the GM200 competitor to be a short lived release.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
They would be better off with GDDR5 with the colour compression used in Tonga (or a further improved version). 4GB will not cut it unless they intend the GM200 competitor to be a short lived release.

No GPU is better off with half as much bandwidth.

So you won't be able to run Super Duper Ultra Textures, boo hoo.
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
They would be better off with GDDR5 with the colour compression used in Tonga (or a further improved version). 4GB will not cut it unless they intend the GM200 competitor to be a short lived release.
Completely plausible, and in fact highly probable, given the rumors suggesting the 3xx series will be made on 28nm (whether at GF or TSMC is still speculation). If/when they switch to a better node, they'll have more headroom for all the crazy stuff, like including 8GB of HBM on the next flagship GPU.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Exar333 said:
Open mouth, insert foot.

The thing everyone is saying is not that the wattage doesn't matter; they are saying that performance is king. In your counter-examples you say that the wattage is preventing performance and overclocking, which in some cases could be true. However, if the performance is still better, heat and all, than another product in that price range, then it does not matter.

Do my CF 290s get hot and prevent extraction of their maximum potential? Yes, they do. Could I have gotten more performance for $480? No; even with the heat and power it was still the best deal. Therefore, wattage was irrelevant, and will be in most enthusiast situations.

If you are doing custom watercooling to get maximum performance from your equipment, that may be another matter entirely because it changes the cost structure. However, if we are talking about performance per dollar, then in most cases the money spent on watercooling could have been spent elsewhere for performance, except when already using the top-of-the-line GPU.

If we are talking pure performance and cost is not a factor, then the card using more wattage could still be the best choice; once cooled properly it may be faster than cards that used less wattage and ran cooler to begin with.

So as you can see from my example, it is possible, when looking at the metrics of performance or performance/$, that a high-wattage card is better. Of course the opposite is true as well, but to say that wattage and efficiency are a big deal is simply marketing.

In the end, look at all the data for the GPU or CPU you are considering and compare within that price range. Find the one that gives the best performance in the metrics you desire, and purchase it. If a product gives good performance and your rig can support it, then do not let marketing FUD prevent you from making a smart purchase.

No.....
What some people are saying is that they would take higher power consumption and lesser performance, as long as it's an AMD product.

Also, flash sales are nice. They should be posted in a hot deals section, but making a huge paged rant based off one is a little strange. It's extremely funny when people click the link to find that it doesn't even exist: neither the deal nor the product. It's bogus. So two thumbs up, what a laugh.

And as for power consumption being meaningless for enthusiasts:
I don't think many of you are getting it. You're totally missing why it's impactful.

To try to keep it simple, people are just more impressed with the GM204. Just like people were impressed with Conroe and the Athlon 64. Enthusiasts have a love for the technology and enjoy being a part of it. They don't care about saving a few dollars, not one bit. You are preaching that stuff to the wrong crowd. They don't care about it. They love being a part of it. They love the technology and having impressive components. The GM204 is technologically impressive and people cling to that.

It's not much at all to do with power consumption, not by itself. Not at all. Not without anything to measure against. See, people are siding with the HW they find more impressive. The more special. Power consumption means nothing unless all of a sudden you have other chips to compare it to. Then you can gain more or less respect for the accomplishment. People are much more willing to accept higher power consumption if it means a more powerful/capable chip. It is not an issue at all then, because they can still find that chip superior and it can still impress them.

But a more power-hungry architecture that doesn't perform as well as the competition? It's much harder to be impressed by that.

True PC enthusiasts are in it for the love of the technology. They are impressionable and are looking to buy the HW that complements that love.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
No.....
What some people are saying is that they would take higher power consumption and lesser performance, as long as it's an AMD product.

Also, flash sales are nice. They should be posted in a hot deals section, but making a huge paged rant based off one is a little strange. It's extremely funny when people click the link to find that it doesn't even exist: neither the deal nor the product. It's bogus. So two thumbs up, what a laugh.

And as for power consumption being meaningless for enthusiasts:
I don't think many of you are getting it. You're totally missing why it's impactful.

To try to keep it simple, people are just more impressed with the GM204. Just like people were impressed with Conroe and the Athlon 64. Enthusiasts have a love for the technology and enjoy being a part of it. They don't care about saving a few dollars, not one bit. You are preaching that stuff to the wrong crowd. They don't care about it. They love being a part of it. They love the technology and having impressive components. The GM204 is technologically impressive and people cling to that.

It's not much at all to do with power consumption, not by itself. Not at all. Not without anything to measure against. See, people are siding with the HW they find more impressive. The more special. Power consumption means nothing unless all of a sudden you have other chips to compare it to. Then you can gain more or less respect for the accomplishment. People are much more willing to accept higher power consumption if it means a more powerful/capable chip. It is not an issue at all then, because they can still find that chip superior and it can still impress them.

But a more power-hungry architecture that doesn't perform as well as the competition? It's much harder to be impressed by that.

True PC enthusiasts are in it for the love of the technology. They are impressionable and are looking to buy the HW that complements that love.

Well said.

While perf/$ is important, it's not the only metric, unless you have little money and that's all you focus on (perfectly alright). True enthusiasts love the technology and all the work/fun that come with it.

Efficiency is more and more important. Choosing an AMD FX CPU + 290X might make an SFF build completely untenable vs. a more efficient Intel CPU + 970. You are talking a few hundred watts less with the latter on a moderate OC. That could be the very difference between throttling and not throttling in your build. That matters.

There is more to life than perf/$. That's why halo products exist...some people just throw money because they want the best (and most expensive) while others are happy to pay a premium for a premium product in terms of engineering and efficiency.

That's why I chose a 5870 over a 480 years ago. That's why I chose a 970 over a 290 last fall. Same with A64 vs. P4, and now again with an i5/i7-K vs. an 8xxx CPU.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
See, people are siding with the HW they find more impressive. The more special. Power consumption means nothing unless all of a sudden you have other chips to compare it to. Then you can gain more or less respect for the accomplishment. People are much more willing to accept higher power consumption if it means a more powerful/capable chip. It is not an issue at all then, because they can still find that chip superior and it can still impress them.

But a more power-hungry architecture that doesn't perform as well as the competition? It's much harder to be impressed by that.

That, and it's a great straw man for posters seeking to divert attention away from less desirable comparisons when they irrationally favor one brand over another (e.g. steering discussion onto meager differences in power consumption in order to prevent discussion of performance per dollar, or of max performance at a given price point).

while others are happy to pay a premium for a premium product in terms of engineering and efficiency..

Oh come on. Are we really devolving to "premium product" hand-waving? I get wanting the newest, shiniest bits and baubles, but calling it "premium" as if that 1) means anything, 2) is in any way measurable, 3) isn't 100% subjective


Real talk:
Paying $100 more for a 970 instead of a 290 isn't about power consumption, because the difference isn't enough to amount to anything unless you're absolutely pushing your PSU. It's not about performance, because aftermarket to aftermarket the performance difference amounts to 5-7%, which is imperceptible. It's about either wanting the shiny new one, or buying nVidia even when it's more expensive for the same thing out of brand loyalty/fanaticism. Maybe a few small exceptions for people who need CUDA or other nVidia-specific feature sets. Then, people realize that's actually emotionally driven and after the fact rationalize their purchase via power consumption or something else measurable, so they can act like they did it due to some objective superiority. (In fact -- I predict when the 380x/390x/3xx comes out it'll be pretty easy to spot the nVidia loyalty buyers versus the newest-shiniest buyers, because some of the new-shinys will move over if the 380x really does have HBM)

For whatever reason people fear saying simply "it's new tech and I like new tech" and derail thread after thread with rationalizations and the people debunking those rationalizations. I don't see anything wrong with wanting the new shiny tech. It's part of being a tech enthusiast. It's tiring seeing repeated dishonest sophistry and rationalization instead of honesty. You can buy on emotion. Just don't try and tell me it's logical...
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
To try to keep it simple, people are just more impressed with the GM204. Just like people were impressed with Conroe and the Athlon 64.

You mean the same Conroe that brought $300 CPUs that easily trounced CPUs that had started the day priced at $1000?

And this is an argument in favor of ignoring price/performance?

The Athlon 64 was part of a bitter war between Intel and AMD fanboys who were rooting for their company to do well and (crucially, because it's the same problem we have today) for the other company to do poorly, because both made them feel better about their purchase, and that fuzzy warm feeling of being right was worth more to them than the benefits of the other side doing well.

When people did a retrospective on awesome graphics cards, a huge fraction of the ones people loved were ones they could unlock for better performance without paying much, or standouts like the 8800 GT and 5850 that hit really nice price/performance points. Notably absent were behemoths like the 8800 GTX.

Conroe got me to build a computer, the 8800 GT made me love it, and both because of price/performance. I just recently got the budget to blow on high end stuff, and guess what? I didn't splash all my money on silicon, and instead I got a bunch of nice peripherals and stuff to make my overall computer experience better.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
So you won't be able to run Super Duper Ultra Textures, boo hoo.

Gloomy said:
Face it, you 2GB 680/770 guys got stiffed.

Source

So which is it? Did the 2GB 680/770 guys get stiffed, or were they just not able to run the "Super Duper Ultra Textures" in Watch Dogs?

And if the next AMD cards only come with 4GB, are they getting stiffed if VRAM runs out @ 4K maxed? Or are they asking too much for a top-tier card?
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Source

So which is it? Did the 2GB 680/770 guys get stiffed, or were they just not able to run the "Super Duper Ultra Textures" in Watch Dogs?

And if the next AMD cards only come with 4GB, are they getting stiffed if VRAM runs out @ 4K maxed? Or are they asking too much for a top-tier card?

There's a difference between 2GB and 4GB.

And an even bigger difference between 300GB/s and 600GB/s.

You're the one being disingenuous here.

The 770 and 280X are the same speed (actually the 280X is much faster now, but at the time they were more similar), so the only difference was that one had less VRAM. So you did get stiffed if you bought a 770: you paid more for a slightly slower (now massively slower) GPU that had less VRAM.

But if the choice is between a 4GB GPU that is twice as fast as a 6 or 8GB GPU? Dude. D:
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
That, and it's a great straw man for posters seeking to divert attention away from less desirable comparisons when they irrationally favor one brand over another (e.g. steering discussion onto meager differences in power consumption in order to prevent discussion of performance per dollar, or of max performance at a given price point).



Oh come on. Are we really devolving to "premium product" hand-waving? I get wanting the newest, shiniest bits and baubles, but calling it "premium" as if that 1) means anything, 2) is in any way measurable, 3) isn't 100% subjective


Real talk:
Paying $100 more for a 970 instead of a 290 isn't about power consumption, because the difference isn't enough to amount to anything unless you're absolutely pushing your PSU. It's not about performance, because aftermarket to aftermarket the performance difference amounts to 5-7%, which is imperceptible. It's about either wanting the shiny new one, or buying nVidia even when it's more expensive for the same thing out of brand loyalty/fanaticism. Maybe a few small exceptions for people who need CUDA or other nVidia-specific feature sets. Then, people realize that's actually emotionally driven and after the fact rationalize their purchase via power consumption or something else measurable, so they can act like they did it due to some objective superiority. (In fact -- I predict when the 380x/390x/3xx comes out it'll be pretty easy to spot the nVidia loyalty buyers versus the newest-shiniest buyers, because some of the new-shinys will move over if the 380x really does have HBM)

For whatever reason people fear saying simply "it's new tech and I like new tech" and derail thread after thread with rationalizations and the people debunking those rationalizations. I don't see anything wrong with wanting the new shiny tech. It's part of being a tech enthusiast. It's tiring seeing repeated dishonest sophistry and rationalization instead of honesty. You can buy on emotion. Just don't try and tell me it's logical...

+1
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
But if the choice is between a 4GB GPU that is twice as fast as a 6 or 8GB GPU? Dude. D:

And if the future of gaming @ 4K requires 6GB for ultra textures, are all 4GB buyers getting stiffed? That's all I'm asking, no need for paragraphs.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
got the budget to blow on high end stuff, and guess what? I didn't splash all my money on silicon, and instead I got a bunch of nice peripherals and stuff to make my overall computer experience better.

The money I saved getting 3x290 instead of 3x290X is going towards another SSD.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
And if the future of gaming @ 4K requires 6GB for ultra textures, are all 4GB buyers getting stiffed? That's all I'm asking, no need for paragraphs.

Yes-- they would be getting stiffed. If the option was between 4GB and 8GB on the same GPU.

But it isn't. AMD can either make a slow GPU with GDDR5 or a fast GPU with HBM-- unlike Nvidia, they're not choosing to stiff you. It's one or the other.

Nvidia could have chosen to ship 4GB 770s. But they didn't. And people ate it up. And they regret it. That's getting stiffed.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I'd say they'd be getting stiffed less, since those cards can't handle 4K very well to start with. The problem is cards that were competitive with their competitors now aren't because part of the card that could've been better was under-specced and it's holding them back. That's much more of a stiffing than GPUs that can't handle 4K not being able to handle 4K because of the memory in addition to the GPU.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yes-- they would be getting stiffed. If the option was between 4GB and 8GB on the same GPU.

But it isn't. AMD can either make a slow GPU with GDDR5 or a fast GPU with HBM-- unlike Nvidia, they're not choosing to stiff you. It's one or the other.

Nvidia could have chosen to ship 4GB 770s. But they didn't. And people ate it up. And they regret it. That's getting stiffed.

Wait what? Since when does the type of memory have anything to do with the actual GPU power and capabilities?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Real talk:
Paying $100 more for a 970 instead of a 290 isn't about power consumption, because the difference isn't enough to amount to anything unless you're absolutely pushing your PSU. It's not about performance, because aftermarket to aftermarket the performance difference amounts to 5-7%, which is imperceptible. It's about either wanting the shiny new one, or buying nVidia even when it's more expensive for the same thing out of brand loyalty/fanaticism. Maybe a few small exceptions for people who need CUDA or other nVidia-specific feature sets. Then, people realize that's actually emotionally driven and after the fact rationalize their purchase via power consumption or something else measurable, so they can act like they did it due to some objective superiority. (In fact -- I predict when the 380x/390x/3xx comes out it'll be pretty easy to spot the nVidia loyalty buyers versus the newest-shiniest buyers, because some of the new-shinys will move over if the 380x really does have HBM)

For whatever reason people fear saying simply "it's new tech and I like new tech" and derail thread after thread with rationalizations and the people debunking those rationalizations. I don't see anything wrong with wanting the new shiny tech. It's part of being a tech enthusiast. It's tiring seeing repeated dishonest sophistry and rationalization instead of honesty. You can buy on emotion. Just don't try and tell me it's logical...

How's this for honesty. I've had enough issues with AMD drivers in the past and heard enough stories about poor crossfire support that I wanted two Nvidia cards instead. One card isn't enough for my performance expectations and SLI is much more well supported overall.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
How's this for honesty. I've had enough issues with AMD drivers in the past and heard enough stories about poor crossfire support that I wanted two Nvidia cards instead. One card isn't enough for my performance expectations and SLI is much more well supported overall.

See, this is exactly the right thing to do. Now if anyone wants to talk about the actual issue at the heart of it, you can, instead of circling around pointlessly talking about power consumption when really something else is the true problem.
 

Timorous

Golden Member
Oct 27, 2008
1,727
3,152
136
Yes-- they would be getting stiffed. If the option was between 4GB and 8GB on the same GPU.

But it isn't. AMD can either make a slow GPU with GDDR5 or a fast GPU with HBM-- unlike Nvidia, they're not choosing to stiff you. It's one or the other.

Nvidia could have chosen to ship 4GB 770s. But they didn't. And people ate it up. And they regret it. That's getting stiffed.

I am sorry, but a 4096 SP, 96 ROP GPU with 8GB of GDDR5 @ 6GHz on a 512-bit bus, with the colour compression and front-end improvements found in Tonga, will be just fine for bandwidth, and clocked correctly its performance will be pretty good too, probably 290X +50% at the low end. The 285 is faster than the 290 in pixel fillrate tests despite having just 176GB/s of bandwidth and 32 ROPs.

If HBM is true, which does seem to be the case, then there are only so many ways to make it fit with everything else being rumoured:
1) A short lived GM200 competitor with 4GB of HBM until 20nm cards are released.
2) Use it in a 380 series card for the power savings and the manufacturing experience. 4GB would be enough in this case.
3) They can actually make 6/8GB fit on a GM200 competitor.
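The GDDR5 arithmetic above can be sanity-checked. A quick sketch; the ~40% uplift is a rough figure often cited for Tonga-style delta colour compression, not a measured constant:

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw GDDR5 bandwidth in GB/s: (bus width / 8 bytes) * effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

raw = gddr5_bandwidth_gbs(512, 6.0)   # 512-bit bus @ 6 Gbps effective
effective = raw * 1.4                 # rough ~40% uplift from delta colour compression
print(raw)        # 384.0 GB/s raw
print(effective)  # 537.6 GB/s effective
```

That puts a compressed 512-bit GDDR5 setup in the same rough neighbourhood as first-gen HBM for bandwidth, which is why option 3 isn't the only plausible design.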
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Real talk:
Paying $100 more for a 970 instead of a 290 isn't about power consumption, because the difference isn't enough to amount to anything unless you're absolutely pushing your PSU. It's not about performance, because aftermarket to aftermarket the performance difference amounts to 5-7% which is imperceptible. It's about either wanting the shiny new one, or buying nVidia even when its more expensive for the same thing out of brand loyalty/fanaticism. Maybe a few small exceptions for people who need CUDA or other nVidia specific feature sets. Then, people realize that's actually emotionally driven and then after the fact rationalize their purchase via power consumption or something else measurable, so they can act like they did it due to some objective superiority. (In fact -- I predict when the 380x/390x/3xx comes out it'll be pretty easy to spot the nVidia loyalty buyers versus the newest-shiniest buyers because some of the new-shinys will move over if the 380x really does have HBM).

Or you could re-read my post above and realize it's much more valid than calling everyone who buys Nvidia a loyalist fanboy.

Actually, you seem completely unaware that when you say this:

For whatever reason people fear saying simply "it's new tech and I like new tech" and derail thread after thread with rationalizations and the people debunking those rationalizations. I don't see anything wrong with wanting the new shiny tech. It's part of being a tech enthusiast. It's tiring seeing repeated dishonest sophistry and rationalization instead of honesty. You can buy on emotion. Just don't try and tell me it's logical...
you are complimenting my post. You are actually on the exact same page.

I don't know how you didn't realize that.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
You mean the same Conroe that brought $300 CPUs that easily trounced CPUs that had started the day priced at $1000?

And this is an argument in favor of ignoring price/performance?

The Athlon 64 was part of a bitter war between Intel and AMD fanboys who were rooting for their company to do well and (crucially, because it's the same problem we have today) for the other company to do poorly, because both made them feel better about their purchase, and that fuzzy warm feeling of being right was worth more to them than the benefits of the other side doing well.

When people did a retrospective on awesome graphics cards, a huge fraction of the ones people loved were ones they could unlock for better performance without paying much, or standouts like the 8800 GT and 5850 that hit really nice price/performance points. Notably absent were behemoths like the 8800 GTX.

Conroe got me to build a computer, the 8800 GT made me love it, and both because of price/performance. I just recently got the budget to blow on high end stuff, and guess what? I didn't splash all my money on silicon, and instead I got a bunch of nice peripherals and stuff to make my overall computer experience better.

Do what?
Huh????

Conroe wasn't a $300 chip; it was an entire lineup. You're not making much sense here. There were expensive chips and budget ones.

You do realize that the 970 was a $300 chip that can best the original $1000 Titan, don't you?

Kind of a strange and illogical response to my post. I am not sure you got it.
 

CakeMonster

Golden Member
Nov 22, 2012
1,428
535
136
If you're the kind of person who is bothered by driver issues, you probably shouldn't run two cards to begin with, regardless of brand.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That, and it's a great straw man for posters seeking to divert attention away from less desirable comparisons when they irrationally favor one brand over another;... It's tiring seeing repeated dishonest sophistry and rationalization instead of honesty. You can buy on emotion. Just don't try and tell me it's logical...

Such an excellent post. :thumbsup::thumbsup: :wub:

And if the next AMD cards only come with 4GB, are they getting stiffed if VRAM runs out @ 4K maxed? Or are they asking too much for a top-tier card?

1. You won't be maxing any next-gen games at 4K on a single GM200/390X. Why? Because at best they will come close to 295X2/980 SLI, and even those can't max out 4K in today's games.

2. If someone is going GM200/390X dual cards or more and plans to keep them for longer than 2 years for 4K sure. However, in practice, given NV's pricing, they'll charge more for GM200. $100-150 premium is not out of the question, which means it'll be simply better to pocket $200-300 savings towards 14/16nm GPU upgrades with 6-8GB of VRAM.

3. In practice, it's more interesting to discuss 2 of AMD's best cards against GM200. Why?

5850x2 vs. 480 - cost almost the same, the former was faster
6950x2 unlocked vs. 580 - cost almost the same, no comparison in terms of performance
7950s OC vs. 680 4GB OC - cost almost the same, no comparison in terms of performance.
R9 290 OC vs. 780Ti OC - cost almost the same, no comparison in terms of performance

GM200 6GB might sound amazing for e-peen, but in practice I bet R9 390 CF won't cost much more and will absolutely level it at 1440p-4K. I'd take 390 CF 4GB over GM200 6GB, because there's no way a single GM200 will be good enough at 4K against dual 2nd-tier AMD cards unless AMD's 300 series is a total flop. I am pretty interested to see benches of 2nd-tier GM200 vs. R9 390 non-X.

Obviously, if in the next 6 months we get a deluge of games that all max out the 4GB of VRAM on 980 SLI at 4K, then you'll be 100% correct. However, today, based on Kyle's testing, it seems 980 SLI and 290X CF run out of GPU power before they run out of VRAM at 4K with AA. I doubt GM200 will be so much faster than 980 SLI/290X CF that 4 vs. 6GB of VRAM even matters. But ya, if going dual GM200s/390Xs, it'll be an interesting point of benchmark investigation.
 