[Benchlife] R9 480 (Polaris 10 >100w), R9

Page 10 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

S.H.O.D.A.N.

Senior member
Mar 22, 2014
205
0
41

I love the chain of thought that led them to assume it HAS TO BE POLARIS, because the CU count doesn't match any other AMD product, as if the whole thing wasn't a custom design done for Sony to begin with. Disabling CUs is a thing, especially on something that needs very consistent yields.

Polaris, especially the 10, is not a product AMD can dump into a low-margin bin. It's new, it's going to have yield issues regardless of the GloFo/AMD deals, and most importantly, it has to do heavy work in the OEM and mid-range consumer markets, not in the 'at cost' console pit.

If AMD gives away their precious Polaris wafers to consoles, then Polaris is either a lemon or they're suicidal.

Edit: Well, there's the possibility that they got some insane deal from Sony. Stranger things have happened.
 
Last edited:

HurleyBird

Platinum Member
Apr 22, 2003
2,726
1,342
136

AMD has created both of its current-gen console processors so far by taking older, off-the-shelf components and disabling a couple of compute units. In effect, Xbox One got the Radeon HD 7790, while PlayStation 4 got a more capable, semi-custom Radeon HD 7870. Here's where things get interesting - the 36 compute unit count cannot comfortably fit any of AMD's existing GPUs. It suggests that Sony and AMD have pushed the boat out, that they are using the upcoming Polaris technology.

Specifically, 36 compute units paired with a 256-bit memory bus sounds uncannily like the rumoured spec for one of two new Radeon graphics chips AMD has in development, codenamed Polaris 10...

...Sony specifically says that an 'Improved AMD GCN' is used, giving us only two real alternatives to choose from - Polaris, which almost certainly has 36 compute units, or the older Tonga, which definitely has just 32.

They are... very confused, to say the least. The number of CUs has nothing to do with anything except the number of CUs.
 

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
Btw, PS4K Neo confirmed specs.

http://www.eurogamer.net/articles/digitalfoundry-2016-sonys-plan-for-playstation-4k-neo-revealed

https://www.youtube.com/watch?v=8JJbWo8y58M

These guys claim they have seen the documentation that Sony has sent to developers (along with devkits).

PS4 and Neo will co-exist; Sony has tough guidelines that devs must adhere to in order to publish games on its platform. Basically, games need to target both: PS4 quality must match the current situation, targeting 1080p/30 fps, while Neo extends that, letting devs pick 60 fps or 4K/30 fps.

There are now leaks that MS is working on the Xbox Next, also going with a Polaris APU.

All 3 major consoles are going to be on the same tech level, x86/GCN; Sony/MS will have backwards compatibility, and Nintendo is jumping onboard the x86/GCN train. Devs will love this because it makes their cross-platform development much easier.

ETA? Sony is accepting "Neo" game submissions in August and requires all games released after October to support Neo. This likely points to an October release for the PS4 Neo.
From your Eurogamer link.

Memory: We're still at 8GB of GDDR5, with a 24 per cent boost to bandwidth compared to the original PS4. The current machine uses 5.5gbps memory modules. Basic maths suggests that Sony has pushed this to the same 7.0gbps modules we see on high-end graphics cards like the GTX 980 and GTX 980 Ti. There are some concerns here. The boost to bandwidth isn't exactly huge, it will still be in contention with CPU utilisation (they both share the same interface), and the bandwidth doesn't scale particularly well with the mooted GPU boost, which - to be frank - is massive.


Could this be a window into Polaris 10 memory compression and optimizations?
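A quick sanity check on the quoted figures, assuming peak GDDR5 bandwidth is simply bus width in bytes times per-pin data rate (my assumption; the 256-bit bus is the rumoured spec from earlier in the thread):

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak GDDR5 bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

ps4 = gddr5_bandwidth_gbs(256, 5.5)          # original PS4: 176.0 GB/s
neo = ps4 * 1.24                             # Eurogamer's quoted 24% boost: ~218 GB/s
full_rate = gddr5_bandwidth_gbs(256, 7.0)    # 7.0 Gbps modules flat out: 224.0 GB/s

print(ps4, round(neo, 2), full_rate)
```

Note that a strict 24% boost over 176 GB/s lands at ~218 GB/s, a touch below the 224 GB/s that 7.0 Gbps modules would deliver flat out, so the memory may be clocked slightly under its rated speed.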
 
Feb 19, 2009
10,457
10
76
Sony's docs talk about an "Improved AMD GCN".

Tonga? Not a chance, it's not much better than GCN 1.1 that's in the PS4.

Polaris is designed to be cheap (it's tiny). Raja and others at AMD have kept on saying it's about value, performance per dollar.

Despite being "value", don't expect it to be low margin either; because it's a small die, the cost/yield should be fine. The NX is rumored to be selling at a loss.

PS4K could have a nice premium over the regular PS4 and it will sell well. Imagine telling all the console crowd they can game at 60 fps like the PCMR or even 4K.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
If AMD gives away their precious Polaris wafers to consoles, then Polaris is either a lemon or they're suicidal.

How? Polaris would be baked into the APU.

Suicidal? Maybe "forward thinking" is the correct response. Seems like, at least on the PS4 side, both the older and the newer architectures would be optimized for. Kind of sounds like a double whammy to me.
 
Feb 19, 2009
10,457
10
76
How? Polaris would be baked into the APU.

Suicidal? Maybe forward thinking is the correct response? Seems like at least on the PS4 side both the older and the newer architecture would be optimized for. Kind of sounds like a double whammy too me.

Polaris 10 PC SKU could well be $299 or below.

232mm2, my rough estimate is ~260 dies per 300mm wafer.

Assuming costs around $8K per wafer (pretty high estimate, double the cost of 28nm). Each chip is ~$30 in wafer costs.

Add yield and binning (cut-down dies, 36 of 40 CUs), and it's potentially $40 per chip once you factor in some chips that fail to make the cut at all.

If they sell Polaris chips at $100 to AIBs, that's well over 100% markup.

The Jaguar cores are already a small portion of the APU; on 14nm FF, they would be tiny. Total APU size would be ~300mm2. Let's give it some room and say 320mm2.

That's ~177 dies per wafer. Since we have 36 CUs, it's a harvested part; say a total of 150 dies make the harvest, and the rest are junk.

Each chip costs $53 for AMD. How much do you think they can sell it for to Sony, Nintendo and later, MS?

Edit: Assume the new consoles carry a price premium, say $499. GDDR5 is dirt cheap, as is the crappy mechanical HDD they use. AMD could sell it for $200 and these guys would still have to agree. When one does, the others will be pressured to accept. What are their alternatives? Not releasing a new console and being left behind? Going with ARM/another GPU and ruining their backwards-compatibility plans, which means gamers are less likely to support the new consoles? Nope, they have no choice: go with it, jack up console prices if they need to, and get their marketing game going for 60 fps fluid gameplay or the 4K/VR experience!

AMD played the low margins to secure the x86/GCN win in current consoles, but in the next consoles, it's about profits and margins as the ecosystem is locked in.
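The die counts in this estimate line up with the standard dies-per-wafer approximation. A minimal sketch, assuming a 300mm wafer and reusing the post's $8K wafer cost and 150-good-die figures (those are the post's assumptions, not known numbers):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: gross dies on the wafer minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius * radius / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def cost_per_good_die(wafer_cost_usd: float, good_dies: int) -> float:
    """Wafer cost spread over the dies that survive binning."""
    return wafer_cost_usd / good_dies

print(dies_per_wafer(232))                      # 232mm2 Polaris 10: 260 dies
print(dies_per_wafer(320))                      # 320mm2 console APU: 183 dies
print(round(cost_per_good_die(8000, 150), 2))   # 150 good dies/wafer: ~$53.33
```

The formula gives ~183 candidate dies for a 320mm2 APU, close to the ~177 quoted above; the exact count depends on die aspect ratio and scribe width.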
 
Last edited:

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Polaris 10 PC SKU could well be $299 or below.

232mm2, my rough estimate is ~260 dies per 300mm wafer.

Assuming costs around $8K per wafer (pretty high estimate, double the cost of 28nm). Each chip is ~$30 in wafer costs.

Add yield and binnings (cut dies, 36/40 CUs), potentially $40 per chip to factor in for some chips that fail to make the cut at all.

If they sell Polaris chips at $100 to AIBs, that's >100% margins.

The jaguar cores are already a small portion of the APU, on 14nm FF, it would be tiny. Total APU size would be ~300mm2. Let's give it some room and say 320mm2.

Per wafer is ~177 dies. Since we have 36 CU, it's a harvested part. Say a total of 150 dies that make the harvest, the rest are junk.

Each chip costs $53 for AMD. How much do you think they can sell it for to Sony, Nintendo and later, MS?

Edit: Assume the new consoles carry a price premium, say, $499. GDDR5 is dirt cheap, as is the crap mechanical HDD they use. AMD can sell it for $200 and these guys would still have to agree. When one does, the others will be pressured to accept. What are their alternatives? Not to release a new console and be left behind? Go with ARM/other GPU and ruin their backwards compatible plans which means gamers are less likely to support the new consoles... nope, they have no choice. Go with it, jack up the console prices if they need to and get their marketing game going for 60 fps fluid gameplay or 4K/VR experience!

AMD played the low margins to secure the x86/GCN win in current consoles, but in the next consoles, it's about profits and margins as the ecosystem is locked in.

seems a bit tragic.
 

Adored

Senior member
Mar 24, 2016
256
1
16
AMD is more likely to eat some of the additional costs of going with interposers and HBM in future consoles rather than make more profit. Right now the important thing is getting it established and driving down costs.

A company like MS could obviously eat those costs easier but they'd want a good return on it.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I love the chain of thought that led them to assume it HAS TO BE POLARIS, because the CU count doesn't match any other AMD product, as if the whole thing wasn't a custom design done for Sony to begin with. Disabling CUs is a thing, especially on something that needs very consistent yields.

It's not that they can't make a 28nm product with that number of cores if they want. It's that a 28nm product, custom or otherwise, with that core count and clock speed would be far too big, power-hungry, and hot to work effectively in a console. That means it has to be FinFET, and since AMD already has Polaris designed for that process and a Wafer Supply Agreement with Global Foundries to guarantee supply, that's the logical choice.

Polaris, especially the 10, is not a product AMD can dump into a low margin bin. It's new, it's going to have yield issues regardless of GloFo/AMD deals and most importantly, it has to do heavy work in the OEM and mid range consumer markets, not in the 'at cost' console pit.

First of all, yields will probably be better than you think. Remember, GloFo isn't developing this process from scratch, they're licensing an already existing process that Samsung has used (albeit in a more primitive form, 14LPE) for quite some time.

Secondly, console APUs have extra CUs on-die specifically to increase yields. The existing PS4 APU, for instance, has 1280 shaders, but two CUs are disabled to give 1152 active. Presumably the same thing will be done here.

You're also assuming console sales must be low-margin, or at least lower margin than sales in the PC market. Nothing says that has to be the case. And I don't think AMD will be supply-constrained. Remember, while Nvidia has to share TSMC's 16nm fabs with Apple, AMD gets priority on GloFo (which doesn't have any other major customers anyway as far as I can tell).
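The harvesting arithmetic is easy to verify: each GCN CU contains 64 shaders, so the PS4's 20 physical CUs with 2 fused off give the 1152 active shaders mentioned above, and a 36-of-40-CU harvest would give 2304. A quick sketch:

```python
SHADERS_PER_CU = 64  # shaders (stream processors) per GCN compute unit

def active_shaders(physical_cus: int, disabled_cus: int) -> int:
    """Active shader count after fusing off CUs for yield."""
    return (physical_cus - disabled_cus) * SHADERS_PER_CU

print(active_shaders(20, 2))   # PS4: 1152 active of 1280 physical
print(active_shaders(40, 4))   # rumoured Neo harvest, 36 of 40 CUs: 2304
```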
 
Feb 19, 2009
10,457
10
76
AMD is more likely to eat some of the additional costs of going with interposers and HBM in future consoles rather than make more profit. Right now the important thing is getting it established and driving down costs.

A company like MS could obviously eat those costs easier but they'd want a good return on it.

These are GDDR5 based, so AMD just gives these guys the chips and they pay for the rest.

For future consoles with HBM2, the APU/SoC will just go up in price, since they won't have to pay for VRAM separately.

AMD has no need to eat any costs, it's pure margins going forward since they've locked all the major console players into x86/GCN for backwards compatibility.

i.e., when you don't have a market and want to become the status quo, you can sacrifice margins for it. When you are the status quo and dominating, it's time to milk it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
How is this relevant? Fiji with 4GB HBM1 @ 28nm achieves a TDP of 175W while the entire PS4 uses 130-150W while gaming.

This is the problem right here. These 2 factors almost completely eliminate the possibility of a 28nm GCN part.

28nm = worse perf/watt, massive die size

4GB HBM1 = more expensive and complex than 8GB GDDR5 (390/390X).

Cut-down 2304 shader Fiji makes no sense. Now we are talking about making a custom 2304 shader 28nm HBM1 chip specifically for PS4.5. Why? Polaris 10 was already made for PC. Take that chip, fuse off some parts to increase yields from a full 2560 shader Polaris 10, and just make it into a 14nm APU.

14nm also allows raising Jaguar clocks 31% without any perf/watt penalty.

Besides, sooner or later Sony/MS will want to minimize costs which ultimately means moving down to lower nodes over the manufacturing life of current gen consoles (this has happened every console gen). That means might as well jump straight on 14nm GCN 4.0, rather than design a new custom 28nm GCN 1.1-1.2 2304 shader HBM part.

The kicker is you cannot have 8GB of HBM1 right now, so this comparison people are making of Nano/Fury against the 290X should not even be contemplated.

I find it odd that so many people are grasping at straws to keep consoles at 28nm when it should be the opposite -- try to come up with reasons why Sony shouldn't just move on with the times and take full advantage of the new GCN architecture (HDMI 2.0) and all of the perf/watt improvements it will bring.

Again, the 2.5X perf/watt claims include about 70% coming from the 28nm -> 14nm shrink, per AMD themselves. How is AMD going to magically pull off a 28nm 2304-shader part for Sony when their R9 390 uses 250-280W!? :sneaky:

How? Polaris would be baked into the APU.

Suicidal? Maybe forward thinking is the correct response? Seems like at least on the PS4 side both the older and the newer architecture would be optimized for. Kind of sounds like a double whammy too me.

If we assume the PS4 generation will last at least another 3 years after September-December 2016, then another 50-60 million consoles could be sold. If Sony prices the PS4K aggressively, I bet a lot of consumers will pay the extra $100-150 for a far more powerful and 4K-media-capable console. This would allow Sony to claim that their console is the most advanced (vs. the NX) and test the market for mid-cycle console upgrades (aka Nintendo 3DS -> New 3DS). If it succeeds, it's a good precedent for next gen's PS5 -> PS5K. What concerns me more is that if the PS4K/Neo sells well, it could actually delay the launch of the PS5 to 2020-2021 from the 2019 I expect. That would suck. Another aspect is that the PS4K/Neo may sell well, but when the PS5 launches, I bet some gamers who bought a 2013-2015 PS4 may decide to wait for a mid-cycle PS5K.

Then again, so many people complain that current gen consoles are under-powered and they buy $60 US launch games but then they still complain when Sony wants to bring out a higher end model? You cannot have it both ways. You either accept consoles for what they are and never compare them to $1000+ 1440-4K PCs or you allow for the evolution of a console concept (i.e., mid-gen hardware upgrades) if you were the type who ripped them apart on launch day for weak horsepower.

Just because the Neo may come out, it doesn't suddenly mean that PS4 is worthless. This point has been repeated in this thread. When GP1080/Vega/Big Pascal come out, does it mean that 970/980/290 are completely worthless? No, you will still be able to play games on those GPUs just not at the same settings/FPS.

I know a handful of console gamers who didn't buy a PS4/XB1 for 3 years, since they were either too busy with other aspects of life or waited until the game library for those consoles built up. These gamers are prime candidates for getting a PS4 Neo from Q4 2016 to Q4 2019. I know if I were considering purchasing a PS4 this year, I'd just wait for the Neo. I cannot be the only one who'd pay $100-150 for a more powerful console to have a better experience without needing to wait 3-4 more years for a PS5/XB2, etc.

What's ironic is that MS is who desperately needed an Xbox One Neo. If Sony releases the Neo, it will make Xbox One even more hopeless.
 
Last edited:

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
These are GDDR5 based, so AMD just gives these guys the chips and they pay for the rest.

Future consoles with HBM2, the APU/SOC will just go up in price since they don't have to pay for vram separately.

AMD has no need to eat any costs, it's pure margins going forward since they've locked all the major console players into x86/GCN for backwards compatibility.

ie, when you don't have a market and want to be the status quo, you can sacrifice margins for it. When you're the status quo and dominating, it's time to milk.

Pascal is already rumored to be more GCN-like; I hope AMD has patented some elements of its design to make sure Nvidia can't completely copy GCN going forward.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Polaris 10 PC SKU could well be $299 or below.

232mm2, my rough estimate is ~260 dies per 300mm wafer.

Assuming costs around $8K per wafer (pretty high estimate, double the cost of 28nm). Each chip is ~$30 in wafer costs.

Add yield and binnings (cut dies, 36/40 CUs), potentially $40 per chip to factor in for some chips that fail to make the cut at all.

If they sell Polaris chips at $100 to AIBs, that's >100% margins.

The jaguar cores are already a small portion of the APU, on 14nm FF, it would be tiny. Total APU size would be ~300mm2. Let's give it some room and say 320mm2.

Per wafer is ~177 dies. Since we have 36 CU, it's a harvested part. Say a total of 150 dies that make the harvest, the rest are junk.

Each chip costs $53 for AMD. How much do you think they can sell it for to Sony, Nintendo and later, MS?

Edit: Assume the new consoles carry a price premium, say, $499. GDDR5 is dirt cheap, as is the crap mechanical HDD they use. AMD can sell it for $200 and these guys would still have to agree. When one does, the others will be pressured to accept. What are their alternatives? Not to release a new console and be left behind? Go with ARM/other GPU and ruin their backwards compatible plans which means gamers are less likely to support the new consoles... nope, they have no choice. Go with it, jack up the console prices if they need to and get their marketing game going for 60 fps fluid gameplay or 4K/VR experience!

AMD played the low margins to secure the x86/GCN win in current consoles, but in the next consoles, it's about profits and margins as the ecosystem is locked in.

From my rough calculations, adding another 20 CUs to the PS4 die will add another ~110mm2. That would make the PS4K die at 28nm equal to almost 450mm2 or close to a Hawaii die.

At 450mm2 we can have almost 125 dies per wafer. Let's say we only take 110 good dies (binning/yields, etc.).

Now, if each 28nm wafer is half of what a 14nm wafer costs, we have $4000 / 110 dies = ~$36 per die.

If the PS4K is a 2016 product, then I will go for a 28nm die.
 

Magee_MC

Senior member
Jan 18, 2010
217
13
81
Pascal is already rumored to be more GCN like, I hope AMD can patent some of the elements of its design to make sure nvidia can't completely copy GCN going forward.

I'm sure that they'll patent them, however, since NV and AMD have a cross-licensing agreement on patents, it won't be able to stop NV from making a more GCN like card.
 

Good_fella

Member
Feb 12, 2015
113
0
0
Pascal is already rumored to be more GCN like, I hope AMD can patent some of the elements of its design to make sure nvidia can't completely copy GCN going forward.

So AMD is creating an API designed for GCN, forcing Nvidia to make a GCN-like architecture, but it turns out it's patented. Sounds like a lawsuit.
 
Feb 19, 2009
10,457
10
76
From my rough calculations, adding another 20 CUs to the PS4 die will add another ~110mm2. That would make the PS4K die at 28nm equal to almost 450mm2 or close to a Hawaii die.

At 450mm2 we can have almost 125 dies per wafer. Lets say we only take 110 good dies (binning/yields etc).

Now if each 28nm wafer is half of what 14nm wafer cost, we have $4000 / 110 dies = ~ $36 per die.

If PS4K is a 2016 product then i will go for a 28nm die.

Since 14nm FF is 2.2x the density, if the entire die is 450mm2 on 28nm it will be very small on 14nm FF, no more than ~220mm2. Cost per die would be ~$30, and power usage would drop significantly.

Thus, the 14nm APU will be a vastly better product, one that allows AMD to charge even more for it. What do you think MS/Sony/Nintendo would pay more for: a very power-efficient, high-performance 14nm APU, or a hot and power-hungry 28nm version? Going with a smaller, more efficient APU on 14nm means they can reduce build costs: less cooling required, a smaller PSU, an overall slimmer product.

It makes no sense to release an outdated APU for next-gen consoles. When the original PS4/Xbone came out, it was on the cutting edge. The next will be the same.
 

showb1z

Senior member
Dec 30, 2010
462
53
91
From my rough calculations, adding another 20 CUs to the PS4 die will add another ~110mm2. That would make the PS4K die at 28nm equal to almost 450mm2 or close to a Hawaii die.

At 450mm2 we can have almost 125 dies per wafer. Lets say we only take 110 good dies (binning/yields etc).

Now if each 28nm wafer is half of what 14nm wafer cost, we have $4000 / 110 dies = ~ $36 per die.

If PS4K is a 2016 product then i will go for a 28nm die.

A ~250W console is highly unlikely.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Well, I'm also counting on 28nm HDL, so again I believe that IF the PS4K is a 2016 product, it could be made at 28nm and not 14nm/16nm.
 
Feb 19, 2009
10,457
10
76
Well, I'm also counting on 28nm HDL, so again I believe that IF the PS4K is a 2016 product, it could be made at 28nm and not 14nm/16nm.

HDL is already used for GPUs, so there's no magic there.

A 28nm APU of those specs would be huge and power hungry. That doesn't bode well for a slim console design.

Polaris is also designed for 14nm FF; it would cost AMD more to back-port it to 28nm.

Without Polaris, they aren't going to get HDMI 2.0 or the newer 4K decoders that these consoles have focused on. Importantly, without Polaris they aren't going to get the big performance improvement, period: with limited GDDR5 bandwidth, they will need the enhanced memory compression. Re-designing GCN 1.1 to add these blocks would also cost AMD more money.

There's no logical scenario where this new high-performance console APU is 28nm.

Rather, let's speculate about how much faster Polaris GCN will be vs GCN 1.1 or 1.2!
 

coercitiv

Diamond Member
Jan 24, 2014
6,403
12,864
136
Well, im also counting on 28nm HDL, so again i believe IF PS4K is a 2016 product it could be made at 28nm and not 14nm/16nm.
If you stick with the 28nm prediction, what power usage will the new PS4K have?
 

Pinstripe

Member
Jun 17, 2014
197
12
81
I can't believe people are arguing over the process node. Of course the PS4K will use 14nm Polaris. The whole Polaris architecture has been specifically designed for VR/HDR/HDMI 2.0/low power. Using anything else would be suicide.
 
Last edited:
Feb 19, 2009
10,457
10
76
Rather than 28nm talk... let's discuss something more interesting...

Can the Primitive Discard Accelerator improve performance as well as reduce bandwidth requirements? How?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yes, yes you are. You need to apply some of your logic.

PS4 is GCN 1.1, 176GB/s.

Tonga has 40% effective memory compression tech, this is from AMD's presentations.

Polaris has an improved memory compression tech. How much better? Let's say a modest gain, up to 50% effective.

So ~226GB/s with 50% compression is equivalent to 452GB/s.

Other uarch changes also improve bandwidth efficiency, by going less back & forth to memory with better pre-fetch and cache etc.

That's called hype, based on the excuse of Polaris' limited bandwidth instead of accepting that Vega will be the real deal.
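The compression arithmetic in the quoted post works out if "50% compression" means half the memory traffic is eliminated, i.e. effective bandwidth = raw / (1 - savings). A sketch; the 40% (Tonga) and 50% (Polaris) savings figures are the quoted post's own claims, not confirmed specs:

```python
def effective_bandwidth(raw_gbs: float, compression_savings: float) -> float:
    """Effective bandwidth if a fraction of memory traffic is compressed away."""
    return raw_gbs / (1 - compression_savings)

print(round(effective_bandwidth(226, 0.5), 1))   # quoted claim: 452.0 GB/s
print(round(effective_bandwidth(176, 0.4), 1))   # 40% savings on the PS4's bus: 293.3 GB/s
```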
 

tajoh111

Senior member
Mar 28, 2005
305
323
136
Polaris 10 PC SKU could well be $299 or below.

232mm2, my rough estimate is ~260 dies per 300mm wafer.

Assuming costs around $8K per wafer (pretty high estimate, double the cost of 28nm). Each chip is ~$30 in wafer costs.

Add yield and binnings (cut dies, 36/40 CUs), potentially $40 per chip to factor in for some chips that fail to make the cut at all.

If they sell Polaris chips at $100 to AIBs, that's >100% margins.

The jaguar cores are already a small portion of the APU, on 14nm FF, it would be tiny. Total APU size would be ~300mm2. Let's give it some room and say 320mm2.

Per wafer is ~177 dies. Since we have 36 CU, it's a harvested part. Say a total of 150 dies that make the harvest, the rest are junk.

Each chip costs $53 for AMD. How much do you think they can sell it for to Sony, Nintendo and later, MS?

Edit: Assume the new consoles carry a price premium, say, $499. GDDR5 is dirt cheap, as is the crap mechanical HDD they use. AMD can sell it for $200 and these guys would still have to agree. When one does, the others will be pressured to accept. What are their alternatives? Not to release a new console and be left behind? Go with ARM/other GPU and ruin their backwards compatible plans which means gamers are less likely to support the new consoles... nope, they have no choice. Go with it, jack up the console prices if they need to and get their marketing game going for 60 fps fluid gameplay or 4K/VR experience!

AMD played the low margins to secure the x86/GCN win in current consoles, but in the next consoles, it's about profits and margins as the ecosystem is locked in.

Yields are not close to that good. Double that price and you're closer to what these chips actually cost.

Remember, AMD makes mid-teens to 20% margins on its console chips: a chip they sell for 100 dollars thus costs them about 80 bucks to make. And this takes into account that console chips do not come fully enabled.

If margins were that great, everyone would be making GPUs, and console revenue would be making these companies hundreds of millions of dollars in net profit instead of the 40-65 million they see off of 450-600 million in console APU sales. Yields are probably more in the 50-60% range this early on.

http://www.isine.com/DieYieldCalculator.html

Use this calculator and you will get a better idea of what a chip costs to produce. It probably costs them between 70 and 90 dollars to make a chip. Yields also get dramatically worse the larger you make a chip.
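The 50-60% ballpark can be sanity-checked with the simple Poisson defect model that die-yield calculators typically offer as a baseline. The defect densities below are illustrative guesses, not published figures for any 14nm process:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: Y = exp(-A * D0), with die area A in cm^2."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * defects_per_cm2)

# ~320mm2 console APU at two assumed defect densities:
print(round(poisson_yield(320, 0.15), 2))   # maturing process: 0.62
print(round(poisson_yield(320, 0.20), 2))   # earlier in the ramp: 0.53
```

Both land in the 50-60% range suggested above, and harvesting (shipping with some CUs disabled) recovers many of the partially defective dies on top of this.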
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
That's called hype. Based on the excuse of Polaris limited bandwidth instead of accepting Vega will be the real deal.

I bet Polaris will be to GCN what Maxwell was to Kepler when it comes to bandwidth needs. Increasing CU cache is paramount to reducing bandwidth dependency.

I predicted this days ago as a reason why the PS4 Neo wouldn't need twice as much memory bandwidth even with twice as much graphics compute. The PS4 Neo is virtually guaranteed to be using Polaris.
 