Intel Broadwell Thread

Page 157 - AnandTech Forums
Mar 10, 2006
So that same shlub (not an enthusiast) is going to dish out $375-$400 (guessing) for a cpu with 6 cores and eDRAM? I am just asking. I am thinking this is a great product idea for people on these forums (6 cores + eDRAM), but the market may not be there for the price they would have to charge.

Let's assume the following price structure:

7700K = $350
Kabylake-X = $400-$450
Skylake-X (6) = $500-$550

Where would this Coffeelake SKU land?

It is exceedingly unlikely that Skylake-X 6 core will come in at $500+ considering where the 6800K and the 5820K before it sat.
 

Edrick

Golden Member
Feb 18, 2010
It is exceedingly unlikely that Skylake-X 6 core will come in at $500+ considering where the 6800K and the 5820K before it sat.

Ok, if it comes in at $400-$450, that just adds less room for a 6 core eDRAM desktop cpu in the lineup. That is the only point I was trying to make.
 

coercitiv

Diamond Member
Jan 24, 2014
If anything, eDRAM will come to desktops because the Xeon chips that the HEDT chips are derived from will start to use it. Broadwell-C probably only existed because it was a derivative of something they'd have done anyway - a laptop chip meant for Apple with a beefy iGPU.

Also, in those benchmarks only games are showing a difference. I think people are exaggerating that it'll make such a difference that it'll revive PC desktop sales. Or something.
eDRAM also makes a lot of sense in mobile and SFF, where you have the exact scenario PurePC tested - DDR4 2133 coupled with a variety of CPUs, including powerful quads.

If I were permitted to answer Arachnotronic's question from another thread, as "couch CEO" of Intel I would have pushed hard to make eDRAM an inherent part of their mobile line: for example, having all mobile i7 CPUs come with eDRAM would have helped this tech establish a clear foothold in the market, helped hide the extra cost in products with high margins, and, ironically, also helped better differentiate their low-power i5 and i7 lines, where even reviewers have a hard time justifying the faster and more expensive models.

In another ironic twist, gaming computers with powerful dGPUs and quad-core CPUs would benefit tremendously from eDRAM: as we've seen from "desktop" Broadwell, in a memory-limited scenario (DDR4-2133) the chip with eDRAM is capable of a performance uplift equal to a clock increase of 500 MHz+. For a mobile CPU this is huge: being able to drop frequency by 500 MHz and get equal or better performance means you just turned your CPU from a 45W TDP into a 35W TDP with no performance penalty (in games). That's 10W you can redirect towards the GPU for better framerates, which often means increasing the dGPU power budget by 20-30%.
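A back-of-envelope version of that 20-30% figure; the dGPU power budgets below are illustrative assumptions, not measured laptop configurations:

```python
# Back-of-envelope sketch of the TDP argument: a 45 W mobile CPU that can
# shed 500 MHz (and, assumed here, ~10 W) with no gaming-performance loss,
# and what the freed power means for a few hypothetical dGPU budgets.
cpu_tdp_before = 45.0            # W, typical mobile quad-core TDP
cpu_tdp_after = 35.0             # W, assumed after the 500 MHz drop
freed = cpu_tdp_before - cpu_tdp_after

for dgpu_budget in (30.0, 40.0, 50.0):   # W, illustrative dGPU budgets
    gain = freed / dgpu_budget * 100
    print(f"{dgpu_budget:.0f} W dGPU budget: +{freed:.0f} W is +{gain:.0f}%")
```

With those assumed budgets the uplift lands in the 20-33% range, which is where the 20-30% figure comes from.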

But I guess aggressively pushing the eDRAM requires spending extra money in a market where they already dominate, and that makes little business sense. I mean, who ever said that only the paranoid survive must have been joking, right?
 
Reactions: Drazick

DrMrLordX

Lifer
Apr 27, 2000
So that same shlub (not an enthusiast) is going to dish out $375-$400 (guessing) for a cpu with 6 cores and eDRAM?

Yes, especially if they were in the market for something like the 6700k or 7700k when they were new (and refused due to lackluster performance gains). Take a look at what the 5775c can do in games with an eDRAM victim cache. The fact that a stock 5775c beats the 7700k in anything is quite extraordinary.
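For anyone unfamiliar with how a victim cache like the 5775c's eDRAM behaves: lines evicted from the on-die cache land in the L4, so a later miss can be refilled from eDRAM instead of DRAM. A minimal toy LRU sketch (sizes and policy here are simplified assumptions, not Crystal Well's actual design):

```python
from collections import OrderedDict

# Toy model: "main" stands in for the on-die LLC, "victim" for the eDRAM L4.
# Both use LRU; lines evicted from main are parked in victim.
class VictimCache:
    def __init__(self, main_lines, victim_lines):
        self.main = OrderedDict()
        self.victim = OrderedDict()
        self.main_lines = main_lines
        self.victim_lines = victim_lines
        self.hits = self.victim_hits = self.misses = 0

    def access(self, addr):
        if addr in self.main:
            self.main.move_to_end(addr)   # refresh LRU position
            self.hits += 1
            return "LLC hit"
        if addr in self.victim:           # refill from eDRAM, not DRAM
            del self.victim[addr]
            self.victim_hits += 1
            self._insert(addr)
            return "L4 hit"
        self.misses += 1                  # full trip to DRAM
        self._insert(addr)
        return "DRAM"

    def _insert(self, addr):
        if len(self.main) >= self.main_lines:
            old, _ = self.main.popitem(last=False)  # evict LRU line...
            self.victim[old] = None                 # ...into the L4
            if len(self.victim) > self.victim_lines:
                self.victim.popitem(last=False)
        self.main[addr] = None

cache = VictimCache(main_lines=4, victim_lines=8)
for addr in [0, 1, 2, 3, 4, 0]:   # 0 gets evicted by 4, then re-accessed
    result = cache.access(addr)
print(result)  # the final access to 0 is served by the victim cache
```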

I mean, who ever said that only the paranoid survive must have been joking, right?

Intel is in a constant competition with itself. That is a competition they can lose if they fail to offer compelling upgrade options for consumers, which has been a persistent problem on the CPU front since Sandy Bridge. Not to speak of what those other people are doing . . .
 
Aug 11, 2008
eDRAM also makes a lot of sense in mobile and SFF, where you have the exact scenario PurePC tested - DDR4 2133 coupled with a variety of CPUs, including powerful quads.

If I were permitted to answer Arachnotronic's question from another thread, as "couch CEO" of Intel I would have pushed hard to make eDRAM an inherent part of their mobile line: for example, having all mobile i7 CPUs come with eDRAM would have helped this tech establish a clear foothold in the market, helped hide the extra cost in products with high margins, and, ironically, also helped better differentiate their low-power i5 and i7 lines, where even reviewers have a hard time justifying the faster and more expensive models.

In another ironic twist, gaming computers with powerful dGPUs and quad-core CPUs would benefit tremendously from eDRAM: as we've seen from "desktop" Broadwell, in a memory-limited scenario (DDR4-2133) the chip with eDRAM is capable of a performance uplift equal to a clock increase of 500 MHz+. For a mobile CPU this is huge: being able to drop frequency by 500 MHz and get equal or better performance means you just turned your CPU from a 45W TDP into a 35W TDP with no performance penalty (in games). That's 10W you can redirect towards the GPU for better framerates, which often means increasing the dGPU power budget by 20-30%.

But I guess aggressively pushing the eDRAM requires spending extra money in a market where they already dominate, and that makes little business sense. I mean, who ever said that only the paranoid survive must have been joking, right?
Yea, I agree. I said a long time ago as well that Intel should make eDRAM standard on quad-core mobile CPUs. But you are right, it would make a nice point of differentiation for the dual-core i7 as well.
 
Reactions: Drazick

IntelUser2000

Elite Member
Oct 14, 2003
eDRAM also makes a lot of sense in mobile and SFF, where you have the exact scenario PurePC tested - DDR4 2133 coupled with a variety of CPUs, including powerful quads.

and ironically also help better differentiate between their low power i5 and i7 lines, where even reviewers have a hard time justifying the faster and more expensive models.

Most people do not think the way reviewers do. Most are focused on tablets and smartphones, and the greater number of those who buy PCs have the mindset that i7 is better than i5, no question. The marketing tactic worked very well, as proven in 2010 and 2011.

It's just that... faster PCs aren't needed. No matter how good it may be.

In another ironic twist, gaming computers with powerful dGPUs and quad-core CPUs would benefit tremendously from eDRAM: as we've seen from "desktop" Broadwell, in a memory-limited scenario (DDR4-2133) the chip with eDRAM is capable of a performance uplift equal to a clock increase of 500 MHz+. For a mobile CPU this is huge: being able to drop frequency by 500 MHz and get equal or better performance means you just turned your CPU from a 45W TDP into a 35W TDP with no performance penalty (in games). That's 10W you can redirect towards the GPU for better framerates, which often means increasing the dGPU power budget by 20-30%.

Why do you think this is so? The very benchmark you quote shows minimal increases apart from gaming scenarios.

You also haven't been keeping close track of Iris Pro parts, have you?

First Iris Pro laptop based on Haswell:
-Comparable battery life to discrete-GPU parts in everyday scenarios, when the discrete parts were enabled
-In reality, the discrete-GPU parts had better battery life because they switched to the lower-power iGPU; no idle power savings from eDRAM
-The laptops were just as expensive as those with discrete graphics
-Battery life under load wasn't better than discrete, meaning there was no TDP advantage

With Broadwell, no one used it for laptops. No one. And it did exactly what you described: it had a lower base frequency. The dual-core Iris parts also show 1 or 2 hours less battery life.

Again, the successor to Broadwell-C was likely cancelled because the markets simply were not interested in such a product. Gamers included.

But I guess aggressively pushing the eDRAM requires spending extra money in a market where they already dominate, and that makes little business sense. I mean, who ever said that only the paranoid survive must have been joking, right?

Actually this is quite natural. In a world of "freemium" apps that apply every kind of psychological strategy to separate consumers from their money, do you expect the rest of the business world to be any different?
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
Why do you think this is so? The very benchmark you quote shows minimal increases apart from gaming scenarios.
And I was talking about gaming laptops.

You also haven't been keeping close track of Iris Pro parts, have you?
I'm actually ignoring Iris Pro since it's unable to compete with modern low-power dGPUs. I've seen some of your thoughts on the matter, and I completely agree with you. That's why I said "ironic twist" in my post: the tech that was supposed to help Iris compete with the dGPU could actually help the CPU + dGPU combo just as much (in a different way, ofc).

First Iris Pro laptop based on Haswell:
-Comparable battery life to discrete-GPU parts in everyday scenarios, when the discrete parts were enabled
-In reality, the discrete-GPU parts had better battery life because they switched to the lower-power iGPU; no idle power savings from eDRAM
-The laptops were just as expensive as those with discrete graphics
-Battery life under load wasn't better than discrete, meaning there was no TDP advantage
You seem to be under the impression I am advocating the Iris Pro iGPU. I am not, not in its current state at least. And not for the foreseeable future either, unless we learn of a sudden and ample shift in strategy from Intel.

With Broadwell, no one used it for laptops. No one. And it did exactly what you described: it had a lower base frequency. The dual-core Iris parts also show 1 or 2 hours less battery life.
No one used the quad-core Broadwell mobile chips launched in June 2015, in order to use the quad-core Skylake chips launched 3 months later. Imagine launching a product that becomes obsolete on its way to the stores. And just for fun: the 5700HQ had higher base clocks than the 6700HQ, and no eDRAM either, yet nobody used it.

Again, the successor to Broadwell-C was likely cancelled because the markets simply were not interested in such a product. Gamers included.
Again, you seem to be under the impression I am advocating for some other product. The eDRAM-equipped mobile quad-core Skylake CPU is alive and well. Meanwhile, using eDRAM in 65-95W TDP CPUs makes little sense when the different form factor allows for a completely different memory configuration, minimizing the benefit of the L4 cache.

Actually this is quite natural. In a world of "freemium" apps that apply every kind of psychological strategy to separate consumers from their money, do you expect the rest of the business world to be any different?
You mean the world of big corporations that spend billions on startups with potential just to minimize the chance that some innovation ends up making them obsolete? A world in which they try to diversify their portfolio so that in the improbable but eventually inescapable moment when competition finally overcomes them, they still have a healthy business ecosystem behind to fund or alternatively help re-imagine their core offering?

Yes, I do expect them to be different than "freemium" apps companies that appear and die by the dozen a day. Long lives don't come cheap.
 
Reactions: Drazick

IntelUser2000

Elite Member
Oct 14, 2003
And I was talking about gaming laptops.

You really think Intel would develop such a complex packaging technology required for eDRAM just for such a niche market? More on that later.


I'm actually ignoring Iris Pro since it's unable to compete with modern low-power dGPUs. I've seen some of your thoughts on the matter, and I completely agree with you. That's why I said "ironic twist" in my post: the tech that was supposed to help Iris compete with the dGPU could actually help the CPU + dGPU combo just as much (in a different way, ofc).

You seem to be under the impression I am advocating the Iris Pro iGPU. I am not, not in its current state at least. And not for the foreseeable future either, unless we learn of a sudden and ample shift in strategy from Intel.

Not just Iris Pro. For Intel, I assume you need enough justification in multiple markets for a project to continue being developed. I assume for Iris Pro parts it was: enthusiasts for the CPU, Iris Pro replacing discrete GPUs, and satisfying Apple. None of them came true satisfactorily. I bet for a company the size of Intel, even those 3 areas succeeding would barely justify its development. It would still be merely a project.


No one used the quad-core Broadwell mobile chips launched in June 2015, in order to use the quad-core Skylake chips launched 3 months later. Imagine launching a product that becomes obsolete on its way to the stores. And just for fun: the 5700HQ had higher base clocks than the 6700HQ, and no eDRAM either, yet nobody used it.

Again, you seem to be under the impression I am advocating for some other product. The eDRAM-equipped mobile quad-core Skylake CPU is alive and well. Meanwhile, using eDRAM in 65-95W TDP CPUs makes little sense when the different form factor allows for a completely different memory configuration, minimizing the benefit of the L4 cache.

No one aside from Intel's mediocre Skull Canyon NUC uses the GT4e Skylake. That's a failure worse than Broadwell's GT3e version.

I am referring back to Iris Pro parts because reviews showed that it was too expensive, and the battery life was not competitive, something that was entirely contrary to what iGPUs were. Yes, if you had switchable graphics on a discrete GPU laptop and the switchable graphics weren't working there's your Iris Pro laptop battery life while running on the Iris Pro. And the two were just as expensive.

And the fault of being expensive and lower battery life is entirely due to having eDRAM on there.

 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
No one aside from Intel's mediocre Skull Canyon NUC uses the GT4e Skylake. That's a failure worse than Broadwell's GT3e version.
One post earlier it was cancelled, now it's a failure existing in just one product. I understand you're trying to prove me wrong, but let's not trample on facts in the process: even with Iris quads being cancelled with Kaby Lake, the eDRAM is still alive and well in other products from Intel's portfolio.

I am referring back to Iris Pro parts because reviews showed that it was too expensive, and the battery life was not competitive, something that was entirely contrary to what iGPUs were.
Battery life was not competitive compared to what? Here we have a Dell XPS 13 laptop with Skylake chips in two reviews: one with eDRAM and one without, with the rest of components as similar as one can find in such circumstances. Battery life is almost identical.

Yes, if you had switchable graphics on a discrete GPU laptop and the switchable graphics weren't working there's your Iris Pro laptop battery life while running on the Iris Pro.
This sentence makes no sense to me.

And the fault of being expensive and lower battery life is entirely due to having eDRAM on there.
At this point, considering the Dell XPS 13 example given above, I'd like to see your info on how eDRAM negatively affects battery life in Skylake systems.

I would also like to remind you that chips using eDRAM also come with bigger iGPU, which increases die area and affects price. So blaming the extra price entirely on the L4 cache is a bit much.

So far my posts have been entirely concentrated on the positive effect eDRAM could have in memory-constrained systems. It's a purely theoretical scenario imagining something that never actually happened: Intel using eDRAM as a means to increase perf/watt in systems with low-power / low-performance memory. If you'd like to express your opinion on this matter, I'm most eager to listen, but otherwise I have little interest in discussing the efficiency of Intel's current GPU architecture and/or how that affects their pricing structure.
 
Reactions: Drazick

DrMrLordX

Lifer
Apr 27, 2000
I would think eDRAM L4 - especially with the improvements Intel made to their L4 in Skylake - would be beneficial to architectures like Skylake/Kabylake that appear to thrive on a diet of additional memory bandwidth.

The eDRAM itself was said by some people on this forum to cost as little as $3.

So why can't Intel tack that on to something like the 7700k or the future 6c/12t Coffeelake and rename the feature as "Gamebooster" or what have you? The PC gaming market is growing, and plenty of people would be willing to pay an extra $25 for a chip that . . . you know . . . makes games run faster?
 
Reactions: VirtualLarry

Sweepr

Diamond Member
May 12, 2006
Regarding Broadwell-C vs Skylake-S/Kaby Lake-S in modern titles, take a look at the latest review from Hardware Canucks:







http://www.hardwarecanucks.com/foru...ws/74538-intel-kaby-lake-i3-7350k-review.html

Still far from optimal for the LGA 1151 platform, but better than DDR4-2133. You will find titles/apps where Core i7-5775C beats a much higher clocked i7-4790K, but generally the latest chips are still faster at stock.
 
Reactions: Drazick

crashtech

Lifer
Jan 4, 2013
The 5775C's lower than customary stock clocks put it at a disadvantage in non-overclocking tests, imo.
 

Ansau

Member
Oct 15, 2015
Regarding Broadwell-C vs Skylake-S/Kaby Lake-S in modern titles, take a look at the latest review from Hardware Canucks:







http://www.hardwarecanucks.com/foru...ws/74538-intel-kaby-lake-i3-7350k-review.html

Still far from optimal for the LGA 1151 platform, but better than DDR4-2133. You will find titles/apps where Core i7-5775C beats a much higher clocked i7-4790K, but generally the latest chips are still faster at stock.

Well, these benchmarks don't tell you the whole story. The 5775C runs at a quite low clock, 3.7GHz. While it doesn't clock very high, 4.2GHz is a decent relative overclock (vs. 4.55GHz for the 6700K), and on top of that there is the eDRAM going up to 2GHz.

Also, all the DDR3 platforms use RAM at 1866MHz CL11 while the DDR4 platforms use a more adequate 2666MHz CL13. All the DDR3 CPUs are underperforming, because even 1600MHz CL9 performs better than the RAM they use, let alone 2400MHz CL10, which helps even the 5775C, as these Polish guys show: https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake

When everything is optimized, the CPU is quite capable of competing against a mildly overclocked 6700K.
 

Dasa2

Senior member
Nov 22, 2014
245
29
91
because even 1600MHz CL9 performs better than the RAM they use, let alone 2400MHz CL10, which helps even the 5775C, as these Polish guys show: https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake
When everything is optimized, the CPU is quite capable of competing against a mildly overclocked 6700K.

Interesting to see that games still have at least some random data that isn't in cache before the CPU needs it, even with a 128MB L4 cache.
Once again Crysis 3 shows it's one of the few games that care more about CPU performance than cache/RAM performance.

Oh, and +1 for wanting L4 cache on a future desktop gaming CPU, preferably without an iGPU to lower price/power, but that won't happen.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
Hopefully with the new cache arrangement Skylake-X won't need eDRAM, because the overhauled cache system will take care of caching memory requests just as well. I can't wait to get my hands on one of them...
 

JoeRambo

Golden Member
Jun 13, 2013
Hopefully with the new cache arrangement Skylake-X won't need eDRAM, because the overhauled cache system will take care of caching memory requests just as well. I can't wait to get my hands on one of them...

I think it would make an even bigger impact. eDRAM latency and bandwidth stats sit between an L3 cache access and a main memory access. So if the L3 has gotten smaller, more evictions from it will happen, and once that data is needed again, it would benefit more from a large pool of eDRAM.
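A toy average-memory-access-time sketch of this; all latencies and hit rates below are illustrative assumptions, not measured Skylake-X numbers:

```python
# Toy AMAT model: eDRAM sits between L3 and DRAM in latency, so the more
# often a shrunken L3 misses, the more an eDRAM L4 can absorb.
L3_LAT, EDRAM_LAT, DRAM_LAT = 40, 100, 220  # cycles (assumed values)

def amat(l3_hit, l4_hit=0.0):
    """Average cycles per access for loads that reach the L3."""
    miss = 1.0 - l3_hit
    # On an L3 miss, a fraction l4_hit is served by eDRAM, the rest by DRAM.
    return L3_LAT + miss * (l4_hit * EDRAM_LAT + (1 - l4_hit) * DRAM_LAT)

print(f"big L3, no eDRAM:    {amat(l3_hit=0.80):.0f} cycles")
print(f"small L3, no eDRAM:  {amat(l3_hit=0.60):.0f} cycles")
print(f"small L3 + eDRAM:    {amat(l3_hit=0.60, l4_hit=0.7):.0f} cycles")
```

With these made-up numbers, shrinking the L3 costs ~44 cycles per access on average, and the eDRAM claws most of that back, which is the sense in which a smaller L3 makes the L4 relatively more valuable.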
 

Lepton87

Platinum Member
Jul 28, 2009
I think it would make an even bigger impact. eDRAM latency and bandwidth stats sit between an L3 cache access and a main memory access. So if the L3 has gotten smaller, more evictions from it will happen, and once that data is needed again, it would benefit more from a large pool of eDRAM.
I'm sure the increased bandwidth of the L3 cache and the bigger L2 will more than make up for the smaller LLC. If the new cache subsystem couldn't even outperform the old one, why would Intel make the changes to the cache subsystem in the first place?
 
Reactions: Sweepr

ehume

Golden Member
Nov 6, 2009
I'm sure the increased bandwidth of the L3 cache and the bigger L2 will more than make up for the smaller LLC. If the new cache subsystem couldn't even outperform the old one, why would Intel make the changes to the cache subsystem in the first place?
Cheaper for Intel to make?
 

Lepton87

Platinum Member
Jul 28, 2009
Cheaper for Intel to make?
Is it? Skylake has 2.375MB of cache per core compared to 2.5MB per core for Haswell/Broadwell, so it's almost the same size. But I think the L2 cache in Skylake takes more space per memory cell than L3, so it's not cheaper but more expensive.
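Worth noting: the 2.5MB figure for Haswell/Broadwell counts only the L3 slice. A quick sketch of the per-core totals with the L2 included, using the commonly documented server-core sizes:

```python
# Per-core cache capacity, using the commonly cited server-core sizes
# (Haswell/Broadwell-EP: 256 KB L2 + 2.5 MB L3 slice per core;
# Skylake-SP: 1 MB L2 + 1.375 MB L3 slice per core).
hsw_l2, hsw_l3 = 0.25, 2.5      # MB per core
skl_l2, skl_l3 = 1.0, 1.375     # MB per core

hsw_total = hsw_l2 + hsw_l3     # 2.75 MB per core
skl_total = skl_l2 + skl_l3     # 2.375 MB per core
print(f"Haswell/Broadwell-EP: {hsw_total} MB/core")
print(f"Skylake-SP:           {skl_total} MB/core")
```

So counted that way, Skylake-SP actually has slightly less total cache per core, with the balance shifted toward the private L2.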
 

JoeRambo

Golden Member
Jun 13, 2013
I'm sure the increased bandwidth of the L3 cache and the bigger L2 will more than make up for the smaller LLC. If the new cache subsystem couldn't even outperform the old one, why would Intel make the changes to the cache subsystem in the first place?

I think my post was about the relative performance increase, not the effects of the Skylake-X/EP cache changes. But with an inclusive L3, Skylake-EP leaves relatively little "free" room in L3. So when some busy cores cause L3 line evictions, those lines get invalidated in the L2 of the cores that hold them, and those cores take the full memory-access hit when they get around to accessing them (if/when they do). That's where my claim comes from that eDRAM would help relatively more than on current CPUs that have a ton of "free" L3.

But of course I am looking forward to Intel returning to large L2 caches. In the past, Conroe/Penryn literally destroyed AMD's K8 even without an integrated memory controller; any workload can benefit from an extra 768KB of L2.
 

IntelUser2000

Elite Member
Oct 14, 2003
One post earlier it was cancelled, now it's a failure existing in just one product. I understand you're trying to prove me wrong, but let's not trample on facts in the process: even with Iris quads being cancelled with Kaby Lake, the eDRAM is still alive and well in other products from Intel's portfolio.

Since most people refer to the 5775C in comparisons, the quad-core chips being cancelled is a valid point. eDRAM is available in very limited products, on the most expensive configurations.

Battery life was not competitive compared to what? Here we have a Dell XPS 13 laptop with Skylake chips in two reviews: one with eDRAM and one without, with the rest of components as similar as one can find in such circumstances. Battery life is almost identical.

Exactly. Here's another review: http://www.notebookcheck.net/Face-O...ore-i5-vs-Surface-Pro-4-Core-m3.156031.0.html

Your claim is that with eDRAM it can be noticeably better.

This sentence makes no sense to me.

Looks like you aren't familiar with quad core Iris parts. Here: http://www.notebookcheck.net/Review-Schenker-S413-Clevo-W740SU-Notebook.98313.0.html

And here: http://www.notebookcheck.net/MSI-GS30-Notebook-Review.142706.0.html

If you normalize for battery capacity, the Iris Pro 5200 is about equal to the far faster 960M in battery life, or worse. And you still pay through the nose for it. And looking at the datasheets, it's clear why the battery life is no better: the OPI interface is said to consume 1W, and the eDRAM adds 3W to TDP and 0.5-1W in standby.

And what I mean is that switchable graphics is better: Iris Pro uses more power than regular iGPUs and has no lower-power option to fall back to, while discrete-GPU setups fall back to the low-power iGPU.

I would also like to remind you that chips using eDRAM also come with bigger iGPU, which increases die area and affects price. So blaming the extra price entirely on the L4 cache is a bit much.

eDRAM is extremely low volume. You also need to add another package. Die sizes, while responsible for costs, do not ultimately determine the final pricing of the product; product positioning and market appeal, along with volumes, do. Also, there's a fixed cost increase from having it off-die, which is why they try to get it on-die when possible.

You also see similar efforts with HMC and HBM, which are used in extremely expensive, high-end products. The front-page AT article about Micron developing GDDR6 despite HBM is an indication that for the next few years HBM, or any on-package memory, is slated to be expensive.
 

Sweepr

Diamond Member
May 12, 2006
PCGamesHardware

- Anno 2205
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 31.4 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 35.8 FPS (14% faster)

- AC: Syndicate
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 109.9 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 135.9 FPS (23.6% faster)

- Crysis 3
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 148.9 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 176.9 FPS (18.6% faster)

- Dragon Age Inquisition
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 112.3 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 135.6 FPS (20.7% faster)

- F1 2015
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 91.1 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 126.3 FPS (38.6% faster)

- Far Cry 4
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 76.8 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 89.3 FPS (16.2% faster)

- Starcraft 2
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 31.1 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 38.2 FPS (22.8% faster)

- The Witcher 3
Ryzen R7 1800X (8C/16T 3.6-4.0 GHz): 80.2 FPS
Core i7-6900K (8C/16T 3.2-3.7 GHz): 134.7 FPS (68% faster)

www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033
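The percentages above are just the ratio of the two FPS figures; here's a quick recomputation (small deviations from the posted percentages come from the FPS values themselves being rounded):

```python
# Recompute "6900K is X% faster" from the posted FPS pairs (1800X, 6900K).
results = {
    "Anno 2205":              (31.4, 35.8),
    "AC: Syndicate":          (109.9, 135.9),
    "Crysis 3":               (148.9, 176.9),
    "Dragon Age Inquisition": (112.3, 135.6),
    "F1 2015":                (91.1, 126.3),
    "Far Cry 4":              (76.8, 89.3),
    "Starcraft 2":            (31.1, 38.2),
    "The Witcher 3":          (80.2, 134.7),
}
for game, (ryzen_fps, intel_fps) in results.items():
    pct = (intel_fps / ryzen_fps - 1) * 100
    print(f"{game}: 6900K is {pct:.1f}% faster")
```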









 
Last edited:
Reactions: Drazick