Discussion Qualcomm Snapdragon Thread


FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
What I fear is Qualcomm imposing a mandatory bundle with modem/Bluetooth etc to generate extra cash. They are totally capable IMO.
They said the modem is an optional M.2 card for OEMs.

But it seems the WiFi/BT chip, which is also an M.2 card, comes mandatorily bundled with the SoC.

Can't complain. Qualcomm WiFi/BT is pretty good, no?
 

eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
It's a bit of a complicated situation.

When X Elite arrives in mid-2024, it will be the undisputed leader in mobile. So Qualcomm has good reason to charge a premium.

But that will change in 4-6 months' time, when Lunar Lake, Arrow Lake Mobile and Strix Point enter the market. So Qualcomm may well have to reduce prices.

You really don’t know Qualcomm, do you?

I have a prediction:

The chip will not be as efficient as claimed, and it won’t perform as well as claimed. It will also be more expensive than the competition.

Given you can’t pair a GPU with it, anything requiring DX12U will be a no-go. (NVIDIA/AMD do not have public ARM Windows drivers.)

GPU performance will be abysmal, drivers will be buggy, etc.

For me to be wrong about all of that would mean Qualcomm invested heavily in this product, which would be a huge change for Qualcomm, because they usually only invest heavily in the lawyers protecting their IP.

Hey, I hope I am wrong, but I remember Qualcomm’s history including all the monopoly/patent abuse and false promises.

The issue with this particular instance is that Qualcomm can’t bully their way in, they have to compete. That competition won’t be the products they so readily compare themselves to, but rather Zen 5/Arrow Lake.
 
Reactions: Thibsie

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
You really don’t know Qualcomm, do you?

I have a prediction:

The chip will not be as efficient as claimed, and it won’t perform as well as claimed. It will also be more expensive than the competition.
That's an ominous prediction indeed.
Given you can’t pair a GPU with it, anything requiring DX12U will be a no-go. (NVIDIA/AMD do not have public ARM Windows drivers.)

GPU performance will be abysmal, drivers will be buggy, etc.
Well the X Elite does have the PCIe lanes to support a dGPU. The question then is whether Qualcomm can work with Nvidia/AMD/Intel to get ARM drivers working for the dGPU.

Regarding the iGPU drivers: drivers are software and can be improved over time through updates. The hardware in the silicon itself is sufficiently powerful, so the situation is not entirely hopeless.

For me to be wrong about all of that would mean Qualcomm invested heavily in this product, which would be a huge change for Qualcomm, because they usually only invest heavily in the lawyers protecting their IP.

Hey, I hope I am wrong, but I remember Qualcomm’s history including all the monopoly/patent abuse and false promises.
Cristiano Amon has championed the push into PCs ever since he became CEO. It was he who oversaw the $1.4B acquisition of Nuvia. Who knows, perhaps he will ensure things are done properly.
The issue with this particular instance is that Qualcomm can’t bully their way in, they have to compete. That competition won’t be the products they so readily compare themselves to, but rather Zen 5/Arrow Lake.
And this raises questions about Qualcomm's future roadmap beyond the X Elite SoC and the Oryon Phoenix core.
 
Reactions: ikjadoon

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
I have a prediction:

The chip will not be as efficient as claimed, and it won’t perform as well as claimed. It will also be more expensive than the competition
I am not dismissing your prediction. It does have some merit. There are already some cracks showing in the early performance previews Qualcomm showed off. The CPU has no E-cores, and the GPU is consuming a disproportionately high amount of power compared to its mobile counterparts.
 

eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
Thinking on this further: The GPU thing is going to be a hard sell. I don’t see any vendor working with Qualcomm. They would essentially be competing with themselves.

NVIDIA - wants to have the ability to pivot hard to the PC space if ARM takes off. GeForce gives them an edge, so I don’t see them wasting it on a competitor, even if it means more sales.

Intel/AMD - This is a no-brainer: release ARM drivers and now you lose out on CPU sales.

We will see. At the very least it will be interesting to watch. “Premium ARM” for PCs (ignore the Mac) is something that IIRC has not been attempted before. Looking forward to seeing how it all plays out.
 
Reactions: Thibsie

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
Thinking on this further: The GPU thing is going to be a hard sell. I don’t see any vendor working with Qualcomm. They would essentially be competing with themselves.

NVIDIA - wants to have the ability to pivot hard to the PC space if ARM takes off. GeForce gives them an edge, so I don’t see them wasting it on a competitor, even if it means more sales.

Intel/AMD - This is a no-brainer: release ARM drivers and now you lose out on CPU sales.

We will see. At the very least it will be interesting to watch. “Premium ARM” for PCs (ignore the Mac) is something that IIRC has not been attempted before. Looking forward to seeing how it all plays out.
In that case, Qualcomm will have to forge their own path, perhaps making big SoCs with huge iGPUs, like the M3 Max.
 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
NVIDIA - wants to have the ability to pivot hard to the PC space if ARM takes off. GeForce gives them an edge, so I don’t see them wasting it on a competitor, even if it means more sales.
Nvidia is certainly the biggest threat to Qualcomm's hegemony in Windows on ARM.
 

Nothingness

Diamond Member
Jul 3, 2013
3,075
2,072
136
We will see. At the very least it will be interesting to watch. “Premium ARM” for PCs (ignore the Mac) is something that IIRC has not been attempted before. Looking forward to seeing how it all plays out.
Yes, that's the first time somewhat competitive Arm PCs will be made. But I'm very pessimistic about their success because of Windows software support; Microsoft isn't Apple, and I'm not sure they did anything to push developers to port their software.

I agree about the potential GPU driver issue, but that will only have an impact if these chips end up in machines that support a discrete GPU, which in turn will only happen if the laptops gain some market share.

I'm only talking about MS Windows here. I think drivers for AMD and NVIDIA GPUs are available for Arm on Linux.
 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
I think you guys are looking at it from the wrong perspective.

The X Elite is clearly not intended for gamers or professionals (content creators, engineers, etc.).

As a matter of fact, look at this video where Ian interviewed Gerard Williams:


Ian asks, 'Who is this product intended for?' and Gerard replies, 'for everyday users'. I think it's very interesting that he says that.

Of course, 'everyday users' are not gamers or professionals, who need a dGPU to do their work.

Qualcomm understands their weaknesses in ARM software compatibility and the GPU situation, which is why they are going after everyday users. In these circumstances, to market the X Elite to gamers/professionals would be a blunder.

And that's not a problem, is it? Everyday users make up a sizable proportion of laptop users. I haven't seen the numbers, but I am willing to bet less than 33% of laptops have dGPUs.

Qualcomm has to start somewhere. Remember, WoA has <1% market share. They need to increase that market share and grow a user base. Eventually, with time, the ARM software compatibility situation and the GPU driver situation will improve.
 
Reactions: Tlh97 and ikjadoon

ikjadoon

Senior member
Sep 4, 2006
221
482
146
Huawei, probably Phytium, Nvidia (mainly for older products like TX2 and Xavier, but still shipping), Fujitsu as you mentioned, Marvell (two different kinds of custom ARM64 cores with two different origins). Not sure offhand who else. It's a sparser field than it used to be.

An even longer tail than I remembered. Sad that most didn't get the market traction to sustain themselves longer.

//

The existence of the Snapdragon X Plus is great news. It means we will get much more affordable laptops with Snapdragon X processors.

That is the most exciting part for me. I do think Qualcomm may actually be OK with undercutting AMD & Intel: more performance, more battery life, and a lower price.

We saw Apple do that with the M1 MacBook Air: Apple, which is probably one of the companies best positioned to jack up prices, had that chance, too. That ridiculously good laptop sold for $749.99 consistently for months.

With Snapdragon X Plus, I desperately pray Qualcomm sees the merit of "cheaper & higher perf / W" vs AMD & Intel laptops like the M1 MBA: a gateway drug for people to accept WoA, no dGPU, perhaps pricier RAM upgrades (due to the high speeds), etc.

If QC can hook people on crazy good battery life in fanless yet still very performant machines, I think it'll work out.
A 12C P-core CPU just seems too extreme for most consumers when there isn't a motivating reason for mainstream users to use 12 cores (especially when you already have an NPU).

Thinking on this further: The GPU thing is going to be a hard sell. I don’t see any vendor working with Qualcomm. They would essentially be competing with themselves.

NVIDIA - wants to have the ability to pivot hard to the PC space if ARM takes off. GeForce gives them an edge, so I don’t see them wasting it on a competitor, even if it means more sales.

Intel/AMD - This is a no-brainer: release ARM drivers and now you lose out on CPU sales.

We will see. At the very least it will be interesting to watch. “Premium ARM” for PCs (ignore the Mac) is something that IIRC has not been attempted before. Looking forward to seeing how it all plays out.

On dGPUs: I do think it's all right if it doesn't work with any dGPUs tbh, as the internal iGPU seems pretty darn fast for creatives & light gaming (it does support DX12). QC's marketing slide shows pretty solid mainstream perf—this is 3DMark Wildlife Extreme, for reference.

Of course, it could be just like Intel's Arc: getting one benchmark right doesn't mean the games are truly optimized in the driver, and we may be in for a long year of "SXE GPU driver increases game perf by 200% (because the 1st driver sucked)."



For the dGPU-gaming crowd: I feel like this is a smaller part of the pie that QC doesn't necessarily need in the first go.

The dGPU notebook market itself is small: notebooks appear to sell roughly 25-35% with dGPUs vs 65-75% iGPU-only.

QC will have many headwinds, too: many AAA games will likely never get a WoA port. Lighter x86 games can probably run at pretty playable frame rates in emulation, as this video shows, and users won't need a fast GPU to hit 100s of FPS:


I used to think people wouldn't accept 720p low as "gaming", but the Steam Deck sales & comments helped change my mind. Of course, one can at least attempt AAA games on the Steam Deck.

//

On the other hand, if NVIDIA & AMD truly enter the WoA market as Reuters claimed, then they'll be forced to create WoA GPU drivers (as they'll ship their own iGPUs or even dGPUs, too, we assume). I don't think they'll try to restrict those drivers to their own SoCs, but everyone does want their own walled garden these days.
 

Hitman928

Diamond Member
Apr 15, 2012
6,133
10,556
136
Interesting.

Yes, it's actually called reel or tape & reel packaging. It's meant to be handled by pick-and-place machines at the OEMs. If you don't need a full reel, there is cut tape packaging, which is basically just short lengths of reel. For a small number of samples you might use something like a gel pak.

Edit: the term 'packaging' here is different from what is referred to as packaging when talking about assembling the IC onto a package.
 

ikjadoon

Senior member
Sep 4, 2006
221
482
146
What is interesting is that it probably does use memory-on-package LPDDR5x. I admit I was wrong on that. They list only one type of memory, LPDDR5X-8533, and the targets for package dimensions or idle power gains follow from this. Which makes sense given their markets.

We’ll see if they go nuts on the marginal pricing there.

Is there a source yet? I'd be curious.

I believe 8533 MT/s is one of the original LP5x JEDEC speeds, so it should be possible off-package, too. AMD was validating LPDDR5x-8500 back in mid-2022, too.

I'd like the efficiency benefits of on-package (it's all soldered for the most part anyways), but I'm curious to see where Qualcomm landed on this question.

I am not dismissing your prediction. It does have some merit. There are already some cracks showing in the early performance previews Qualcomm showed off. The CPU has no E-cores, and the GPU is consuming a disproportionately high amount of power compared to its mobile counterparts.

My only fear is (was?) Qualcomm trying to one-up Intel / AMD / Apple with needless clock speeds on 1T, but with Snapdragon X "Plus" allegedly coming, I pray that won't happen.

The AnandTech chart showed that 1T 4.3 GHz Oryon was virtually confirmed to thermal throttle even on a "normal fan curve" in a laptop:



A 9% loss, when Geekbench 1T itself is rather consistent between OSes (GB6.1 comparison; GB6 comparison), so I'd say virtually all of that 9% was lost to throttling:






As for the Linux Geekbench results, because Qualcomm does not yet have fan control working under Linux, these systems were running with their fans on full blast. Whereas the Windows systems were running with more typical fan ramp curves, and thus didn’t enjoy the Linux laptops’ effectively unlimited thermal environment. Regardless, the primary purpose of the Linux demo was to showcase that Linux was working on the Snapdragon Elite X as well – that it’s not just for Windows – as Qualcomm has aims of getting the SoC into Linux laptops as well.
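A minimal sketch, in Python, of the arithmetic behind a figure like that 9%, using purely hypothetical placeholder scores (the real GB6 1T numbers were in the screenshots above):

# Hypothetical placeholder scores; the real GB6 1T results were in the attached screenshots.
gb6_1t_linux_full_fan = 3100.0   # hypothetical: Linux demo, fans at full blast (no throttling)
gb6_1t_windows_normal = 2820.0   # hypothetical: Windows demo, normal fan ramp curve

loss = 1.0 - gb6_1t_windows_normal / gb6_1t_linux_full_fan
print(f"1T performance lost to the normal fan curve: {loss:.1%}")   # ~9% with these placeholder values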

Qualcomm should be reasonable, especially in the "Plus" variants: the 4 GHz of the slower X Elite variant, or even lower, is honestly fantastic for everyday users. Oryon's perf/GHz is so high already.

Qualcomm isn't one to wildly OC their parts—they aren't Intel, lol, but I feel the Qualcomm vs Apple ego battle may well rival the Intel vs AMD ego battle in perf.
 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
I'd like the efficiency benefits of on-package (it's all soldered for the most part anyways), but I'm curious to see where Qualcomm landed on this question.
I remember a Qualcomm rep saying something alluding to this. I actually posted it in this thread (or the former Nuvia thread, not sure), but I can't find it.
My only fear is (was?) Qualcomm trying to one-up Intel / AMD / Apple with needless clock speeds on 1T, but with Snapdragon X "Plus" allegedly coming, I pray that won't happen.
I am not a fan of Qualcomm gimping ST performance in the lower-tier chips the way Intel/AMD do. I wish they would stick to the Apple approach of providing the same ST performance across the entire series.
The AnandTech chart showed that 1T 4.3 GHz
I am concerned about these ARM cores pushing 4 GHz+
Qualcomm isn't one to wildly OC their parts—they aren't Intel, lol, but I feel the Qualcomm vs Apple ego battle may well rival the Intel vs AMD ego battle in perf.
Ah yes, Apple vs Qualcomm.

The battle we needed but didn't ask for.
 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106

First A18 leak is here, and it puts Oryon in a bad light.

Oryon CPU in 8G4 does ~2800 in GB6 ST.

If this A18 leak is true, then A18 will have a comprehensive 25% ST performance advantage over 8G4 Oryon.

What's the use of the expensive Oryon custom core project if they are not able to at least come close to Apple's ST performance?
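As a rough back-of-the-envelope in Python, here is what that claimed gap would imply; only the ~2800 GB6 ST figure and the 25% gap above are inputs, and the A18 number is just the arithmetic consequence, not a leaked score:

# Back-of-the-envelope: what a 25% ST lead over the 8G4's Oryon would imply.
oryon_8g4_gb6_st = 2800        # ~2800 GB6 single-thread, as stated above
claimed_a18_lead = 0.25        # the "comprehensive 25%" advantage discussed above

implied_a18_gb6_st = oryon_8g4_gb6_st * (1 + claimed_a18_lead)
print(f"Implied A18 GB6 ST: ~{implied_a18_gb6_st:.0f}")   # ~3500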
 

gdansk

Platinum Member
Feb 8, 2011
2,896
4,387
136
First A18 leak is here, and it puts Oryon in a bad light.

Oryon CPU in 8G4 does ~2800 in GB6 ST.

If this A18 leak is true, then A18 will have a comprehensive 25% ST performance advantage over 8G4 Oryon.

What's the use of the expensive Oryon custom core project if they are not able to at least come close to Apple's ST performance?
I'd wait to cry until it actually happens.
As it stands right now, they're already about 28-30% ahead with the A17 Pro. So it'd be a slight relative improvement in that benchmark. Allegedly.
But the X5 may end up making Oryon irrelevant unless it really does have better efficiency.
 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
I'd wait to cry until it actually happens.
As it stands right now, they're already about 28-30% ahead with the A17 Pro. So it'd be a slight relative improvement in that benchmark. Allegedly.
But the X5 may end up making Oryon irrelevant unless it really does have better efficiency.
As per the rumour mill, the 8G4 uses the same Oryon Phoenix core as the X Elite, but clocked at 4.0 GHz. That squares with the ~2800 GB6 ST figure for the 8G4.

The ominous implication of this is... there will be 0% IPC improvement 1 year later.

Allegedly the 2nd gen Oryon core codenamed "Pegasus" got delayed by 6 months for whatever reason.

That Pegasus better be packing 50% performance improvement for the 8G5 / X Elite Gen 2. Do you guys think that is possible?
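Rough numbers behind that, sketched in Python; the X Elite figure below is just naive linear clock scaling from the rumoured 8G4 numbers above, not a measured result:

# Naive linear clock scaling, assuming the same Phoenix core and perfect frequency scaling.
gb6_st_8g4 = 2800        # rumoured 8G4 GB6 ST at 4.0 GHz (from above)
clock_8g4 = 4.0          # GHz
clock_x_elite = 4.3      # GHz, the X Elite's 1T boost mentioned earlier in the thread

perf_per_ghz = gb6_st_8g4 / clock_8g4            # ~700 GB6 points per GHz
implied_x_elite = perf_per_ghz * clock_x_elite   # ~3010 if scaling were perfect
pegasus_target = gb6_st_8g4 * 1.5                # the hoped-for +50% generational jump: ~4200
print(f"Implied X Elite GB6 ST: ~{implied_x_elite:.0f}; Pegasus +50% target: ~{pegasus_target:.0f}")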
 

Doug S

Platinum Member
Feb 8, 2020
2,759
4,698
136
View attachment 93009
First A18 leak is here, and it puts Oryon in a bad light.

Oryon CPU in 8G4 does ~2800 in GB6 ST.

If this A18 leak is true, then A18 will have a comprehensive 25% ST performance advantage over 8G4 Oryon.

What's the use of the expensive Oryon custom core project if they are not able to at least come close to Apple's ST performance?

Leaks this far out are never going to be right. I'm sure Apple has some early A18s in hand by now, but it probably isn't the final stepping, and they almost certainly won't have decided on its final clock speed. Even if that was a real A18 result, it's muddied by the fact that another stepping or two (or tweaks from TSMC, since it would be from risk production) might help performance, and it may be installed in some sort of testbed case with a fan or heatsink, where it is clocked higher than the chips found in iPhone 16s will be.

It is also quite possible this guy (or whoever "leaked" this info to him) simply made up some numbers.
 

soresu

Diamond Member
Dec 19, 2014
3,214
2,491
136
As per the rumour mill, the 8G4 uses the same Oryon Phoenix core as the X Elite, but clocked at 4.0 GHz. That squares with the ~2800 GB6 ST figure for the 8G4.

The ominous implication of this is... there will be 0% IPC improvement 1 year later.

Allegedly the 2nd gen Oryon core codenamed "Pegasus" got delayed by 6 months for whatever reason.

That Pegasus better be packing 50% performance improvement for the 8G5 / X Elite Gen 2. Do you guys think that is possible?
Bruuuuh - hol' up a few months there.

We haven't even got Phoenix-based products on the market yet.

That being said I reserve the right to laugh until the air pressure liquefies my lung tissue if Oryon ends up being another Kryo.

For certain, things could get dicey for them if the Cortex X5 turns out to be only as good a jump over the X4 as the X1 was over the A77.

I wonder what we can expect from A730/Chaberton in terms of perf/watt increase over A720.....
 
Reactions: Nothingness

SpudLobby

Senior member
May 18, 2022
980
673
106
Bruuuuh - hol' up a few months there.

We haven't even got Phoenix-based products on the market yet.

That being said I reserve the right to laugh until the air pressure liquefies my lung tissue if Oryon ends up being another Kryo.

For certain, things could get dicey for them if the Cortex X5 turns out to be only as good a jump over the X4 as the X1 was over the A77.

I wonder what we can expect from A730/Chaberton in terms of perf/watt increase over A720.....
Lol, yeah, it would be funny if the X5 comes around and makes this endeavor look less impressive, but it won’t be a Kryo case, because Kryo was actually meaningfully inferior on both power and performance, and at more area, though Qualcomm still wouldn’t spend extra on SRAM and logic in even a directionally similar way to how Apple would - it was just like Arm’s stuff, but worse and with some extra transistors.

Back then Qualcomm had no real littles, and I suspect that won’t quite be the case this time, because they have a much better design to work from and can afford to spend a bit on L1/L2.

It’s also the ex-Apple engineers doing this, and I doubt they will leave anytime soon, so we’ll see.

That being said: I agree there’s a funny element to it if Arm decides to start charging more, takes the gloves off and builds better, bigger cores plus mandates more cache, and the power and performance curves of their future X and A7x cores in MediaTek or Samsung chips end up barely different from Qualcomm’s, yeah.
 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
Qualcomm’s CEO Cristiano Amon tipped a potential launch date while discussing the company’s own upcoming product, the Snapdragon X Elite, a new Arm-based chip for PCs that promises to shake up the laptop market.

Windows 12 wasn't mentioned by name. But in an earnings call, Amon said the Snapdragon X Elite is slated to arrive in “mid-2024” when Microsoft unleashes the "next version of Windows."

“We're tracking to the launch of products with this chipset tied with the next version of Microsoft Windows that has a lot of the Windows AI capabilities. We're still maintaining the same date, which is driven by Windows, which is mid-2024, getting ready for back-to-school,” he said.
Getting ready for the Snapdragon X Elite

One thing that's interesting is that Hudson Valley will be based on the Germanium platform release, and that platform update is set to hit RTM in April. However, Hudson Valley itself won't RTM until August, with a general release in September or October. The platform changes have more to do with the underlying tech, and it looks like Microsoft wants to have it ready earlier so that Arm devices powered by the Snapdragon X Elite can ship with it preinstalled.

Indeed, it's said that the Germanium platform has important changes for the Snapdragon X Elite and these PCs can't be shipped with the current version of Windows 11, but manufacturers want to ship them in June 2024. As such, these Arm-based PCs will ship with the Germanium platform release, but they won't have all the Hudson Valley features out of the box. They'll have to wait for the update coming a few months later, but it will simply be a cumulative update. For everyone else, Hudson Valley will release alongside the Germanium platform release as one big feature update, like Windows 11 was to Windows 10.
 

John Bruno

Junior Member
Oct 22, 2023
11
47
51
Is there a source yet? I'd be curious.

I believe 8533 MT/s is one of the original LP5x JEDEC speeds, so it should be possible off-package, too. AMD was validating LPDDR5x-8500 back in mid-2022, too.

I'd like the efficiency benefits of on-package (it's all soldered for the most part anyways), but I'm curious to see where Qualcomm landed on this question.



My only fear is (was?) Qualcomm trying to one-up Intel / AMD / Apple with needless clock speeds on 1T, but with Snapdragon X "Plus" allegedly coming, I pray that won't happen.

The AnandTech chart showed that 1T 4.3 GHz Oryon was virtually confirmed to thermal throttle even on a "normal fan curve" in a laptop:

View attachment 92988

A 9% loss, when Geekbench 1T itself is rather consistent between OSes (GB6.1 comparison; GB6 comparison), so I'd say virtually all of that 9% was lost to throttling:



View attachment 92993
View attachment 92994



Qualcomm should be reasonable, especially in the "Plus" variants: the 4 GHz of the slower X Elite variant, or even lower, is honestly fantastic for everyday users. Oryon's perf/GHz is so high already.

Qualcomm isn't one to wildly OC their parts—they aren't Intel, lol, but I feel the Qualcomm vs Apple ego battle may well rival the Intel vs AMD ego battle in perf.

This chart was disclosed at Snapdragon Summit and posted here on AnandTech:

https://images.anandtech.com/doci/21105/Snapdragon X Elite Pre-Briefing Deck 10.jpeg

It shows the power consumption of all 12 cores running Geekbench v6.2. Single core is nowhere near thermal throttling in any of the devices demonstrated at Snapdragon Summit.

 

FlameTail

Diamond Member
Dec 15, 2021
3,896
2,324
106
Hello @John Bruno

I have some suggestions for the naming of future Snapdragon X series processors.

(1) Do not use the "Gen" suffix. I don't believe it is necessary. For example, there could be a choice between:

A: Snapdragon X Elite Gen 2
B: Snapdragon X2 Elite

I much prefer the latter. It looks a lot simpler and more elegant. I think many will agree with me.

(2) We are aware from leaks that other tiers of processors are coming, such as "Snapdragon X Plus", in addition to Snapdragon X Elite. I like this naming system with suffixes like "Plus" and "Elite". It's simple and customer-friendly.

So I suggest you also consider the "Ultimate" suffix for future Snapdragon X processors.

Snapdragon X2 Ultimate

That sounds epic! 'Ultimate' is a lot better than 'Ultra', which is overused these days in tech.
______________

I hope you appreciate this feedback!
 