Intel keeps up the unethical SDP scam with “new” 4.5W parts [S|A]

Status
Not open for further replies.

dawheat

Diamond Member
Sep 14, 2000
3,132
93
91
In fairness to Intel, it's a bit unfair to expect them to compare the TDP of their processors against competition from Qualcomm and Samsung, where TDP isn't used and the comparison wouldn't be apples to apples.

So they define a new term that positions them better against those products. Is it biased? Of course - but it won't confuse OEMs, who will draw their own conclusions. Does it matter to end consumers? Not in the slightest.

It does behoove Intel to come up with a credible and documented methodology so their products can be compared apples to apples with their upcoming competition.

But scam, fraud? Please - it's about as much "fraud" as AMD marketing a 5GHz CPU (which is to say, not at all, IMO).
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I find it strange that any end users here even care about this. These ARE NOT CPUs that you will be implementing in your own system. It's up to the mobile (tablets mostly) OEMs to properly implement these chips and ensure that the heat won't overwhelm the hardware in all uses. If the system does get hot, it will also have to be able to throttle. I specifically bring this up because it isn't on Intel to present these chips properly to consumers... it's up to the OEM. If the OEMs find them too much of a hassle, they shouldn't implement them. However, if Anand's recent article is any indication, these might work rather well.

Also, keep in mind that even ARM-based tablets can overheat. Here's an iPad overheated.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
I find it strange that any end users here even care about this. These ARE NOT CPUs that you will be implementing in your own system. It's up to the mobile (tablets mostly) OEMs to properly implement these chips and ensure that the heat won't overwhelm the hardware in all uses. If the system does get hot, it will also have to be able to throttle.

OEMs do this regardless of TDP and have been doing this for years. The issue here is that it's now just an opportunity for Intel to market something that's been around for a while by making them official products with official 'SDPs' (whatever the hell that means).

When building a laptop or tablet, an OEM has the option of taking a chip and downclocking (and even setting the ceiling temperature) to prevent overheating. The chips are still binned at Y TDP - in this case it's 11.5W. The TDP hasn't changed, but because of the cooling and form factor, additional tweaking is required to get the product to fit.

This isn't a case of Intel introducing anything new, just a marketing term to reflect the above, but because it's Intel's, it gets an Intel 'SDP'. Bear in mind this SDP means nothing until it's defined with respect to cooling (how much heat will the tablet have to cool?), power consumption (at full throttle it'll be 11.5W TDP unless it's downclocked/limited to a specific clock), and workload (how in the hell is SDP even determined?). And does it even include things like the RAM, WiFi, and whatever else is missing from Haswell being a complete SoC?

Which leads us to a very simple question: 4.5W of what?
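The "4.5W of what?" complaint can be made concrete with a toy sketch (all numbers invented for illustration, not measured): a worst-case figure and a scenario average are computed from completely different traces, so the same chip can honestly carry both an 11.5W "TDP" and a 4.5W "SDP".

```python
# Toy illustration (hypothetical numbers): the same chip can be quoted
# very differently depending on which power trace you summarize.

def sustained_max(trace_w):
    """TDP-style figure: the worst-case sustained draw the cooler must handle."""
    return max(trace_w)

def scenario_average(trace_w):
    """SDP-style figure: the mean draw over some vendor-chosen light workload."""
    return sum(trace_w) / len(trace_w)

# One-second samples, in watts (invented for illustration).
heavy_load = [11.5, 11.5, 11.0, 11.5, 11.2]   # CPU + GPU fully loaded
light_browsing = [2.0, 6.5, 3.0, 9.0, 2.0]    # bursty web-browsing workload

print(sustained_max(heavy_load))        # what the cooling must be sized for
print(scenario_average(light_browsing)) # what a "scenario" average can claim
```

Until the vendor says which trace the average is taken over, the scenario figure pins down nothing.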
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The funny thing here is that there is no universal definition of TDP, so TDP numbers between two different manufacturers cannot be compared. So even if Intel used "TDP" they could state whatever the heck they want, which is what AMD seems to do. There is no universal definition of workload associated with TDP either.

So it isn't okay for Intel to use "SDP", but it is okay for AMD to use "ACP"? TDP has no universal definition anyway. So this thread exists...why?
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Why 2x? Why not 5x? That seems more realistic to me.

Consider a heavy graphic load on Haswell ULV (15W). The CPU cores might be down at 2-3W power draw leaving 12-13W for graphics.

With Haswell on tablet at 4.5W "SDP", the CPU cores might be down at 1-2W, leaving...2.5W for the graphics? Even if the CPU cores were consuming no power (impossible) it would still only leave 4.5W for the graphics.

At best Haswell will be 1/3rd to half as fast as S800, at worst it'll be...a lot worse - unplayable in many cases because there is a very tight limit to how much power the IGP can be given. Intel's graphics are not efficient and never have been - they need sky-high clocks to make up for the inherent architectural weakness, and those high clocks can't be maintained at low TDP.

There is no way a 11.5W TDP Haswell-Y (4.5-6W SDP) is going to perform 5x worse than a 15W TDP Haswell-U part, no matter how much you wish it does (even my 2x estimate is pessimistic). Snapdragon 800 is actually dangerously close to a 15W AMD Kabini A4 5000 in GPU performance, both losing to last years HD4000 ULV. Even 19W Trinity parts are not exactly graphics performance champions, so please stop pretending that they're miles behind the competition.
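For reference, the budget split both sides of this exchange are arguing about is simple subtraction. A sketch using the thread's own rough estimates (all figures are the posters' guesses, not measurements):

```python
# The budget arithmetic from the posts above, as a tiny helper.
# All inputs are the thread's rough estimates, not measured values.

def gpu_budget(package_limit_w, cpu_draw_w):
    """Power left for the iGPU once the CPU cores take their share."""
    return max(0.0, package_limit_w - cpu_draw_w)

print(gpu_budget(15.0, 2.5))  # Haswell ULV at 15W: ~12.5 W left for graphics
print(gpu_budget(4.5, 2.0))   # 4.5 W "SDP" scenario: ~2.5 W left for graphics
```

The disagreement in the thread is not about this subtraction but about whether the 4.5W figure, rather than the 11.5W TDP, is the limit that actually applies under a gaming load.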
 
Last edited:

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
The funny thing here is that there is no universal definition of TDP, so TDP numbers between two different manufacturers cannot be compared. So even if Intel used "TDP" they could state whatever the heck they want, which is what AMD seems to do. There is no universal definition of workload associated with TDP either.

So it isn't okay for Intel to use "SDP", but it is okay for AMD to use "ACP"? TDP has no universal definition anyway. So this thread exists...why?

It isn't 'okay' for either one to use it. They're just looking to score brownie points among those idiotic enough to believe the nonsense.

And that's why this thread exists - to point out the idiotic nonsense.

TDP dictates how much cooling a chip will require when running at full throttle. It won't go over that, and if it does then the chip maker screwed up. It's there so things like overheating don't occur and OEMs and system builders can build with specificity that's required when dealing with highly sensitive parts.

On the other hand, SDP is...

we'll get back to that, I guess...
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
OEMs do this regardless of TDP and have been doing this for years. The issue here is that it's now just an opportunity for Intel to market something that's been around for a while by making them official products with official 'SDPs' (whatever the hell that means).

Based on your wording, I think you might have missed my point. You said "an opportunity for Intel to market", and I tried to stress that it's the OEMs that will be using these chips not end-users. Usually people worry about marketing because it's used to fool ignorant consumers, but these are companies that hire people to (hopefully) ensure that a component will work in the end product. I think you kind of talk about that later on in your post, but I just don't like the term "marketing" being applied to this as it doesn't really make sense in this context.

So, really, the only complaint we have is if this turns into bupkis and no OEMs actually use this product. We got our hopes up for an x86-based super chip for tablets, and ended up still using ARM devices.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Again, there is no set in stone definition of TDP. Every manufacturer uses something different - TDP dictates how versatile a cooling system must be, and the maximum heat it must dissipate during the use of "real world applications". However, TDP DOES NOT EQUAL THE MAXIMUM POWER A CPU CAN DRAW. Every silicon manufacturer publishes a TDP figure based on their internal figures for real world application usage - it is never, without exception, based on maximum power. Thus every manufacturer can be honest or dishonest on what they consider "real applications". There are no universal "real applications" to be used for a TDP figure. Again, every manufacturer determines that on their own and _not based on maximum power consumed_. AMD's figures are not comparable to Intel. Intel's figures are not comparable to Qualcomm. Qualcomm figures are not directly comparable to nvidia's.

Which is why this thread is completely and utterly silly. I would suggest that Intel is far more honest with their figures than AMD is, but then again...TDP is completely determined by the manufacturer. There is not and will not be a universal definition of what constitutes a real-use workload, whether you call it TDP, SDP, or AMD's ACP.
 
Last edited:

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Based on your wording, I think you might have missed my point. You said "an opportunity for Intel to market", and I tried to stress that it's the OEMs that will be using these chips not end-users. Usually people worry about marketing because it's used to fool ignorant consumers, but these are companies that hire people to (hopefully) ensure that a component will work in the end product. I think you kind of talk about that later on in your post, but I just don't like the term "marketing" being applied to this as it doesn't really make sense in this context.

http://techreport.com/news/25126/intel-teases-4-5w-haswell-cpu-for-fanless-tablets
http://www.xbitlabs.com/news/cpu/di...lur_the_Line_Between_Laptops_and_Tablets.html
http://www.anandtech.com/show/7168/...-sdp-parts-in-limited-volumes-later-this-year

OEMs, eh? I'm guessing the press release was for them as well then?

However, TDP DOES NOT EQUAL THE MAXIMUM POWER A CPU CAN DRAW. Every silicon manufacturer publishes a TDP figure based on their internal figures for real world application usage - it is never, without exception, based on maximum power.

That's not what I said. If you're going to argue with me then you should at least read what I'm actually typing.

TDP dictates how much cooling a chip will require when running at full throttle. It won't go over that, and if it does then the chip maker screwed up. It's there so things like overheating don't occur and OEMs and system builders can build with specificity that's required when dealing with highly sensitive parts.

Intel is definitely far more honest than AMD is with respect to TDP (see IDC's findings), but that doesn't make 'SDP' anything but a dishonest marketing ploy that's only there to reel in the idiots.

Your 35W Haswell laptop CPU can have a 4.5W SDP just like the 11.5W TDP chips can. What's the difference? Where's the line?
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
There is no way a 11.5W TDP Haswell-Y (4.5-6W SDP) is going to perform 5x worse than a 15W TDP Haswell-U part, no matter how much you wish it does (even my 2x estimate is pessimistic).

See, that "15W" Haswell ULV is actually more like 15W SDP too. It's just that most tech sites don't measure power draw while gaming.

When they do though, the results are somewhat...surprising.



17 Watts Intel ULV drawing 35W system power while the 15W AMD Kabini draws 20W (both with screen off). Hmmm.

Check out the Crysis 3 power draw for the i7 4750HQ (Iris Pro 5200)

80+ Watts for a 45 Watt cpu? That's almost as much as AMD's desktop Richland part.

Now you see why most sites don't show gaming power draw for Intel's ULV chips, right? It just doesn't add up to their TDP numbers. The reason? Turbo and benchmarketing. These graphics are using far more power than their 4.5W/15W/47W or whatever TDP is supposed to allow - for short periods of time that look much better in benchmarks.

It's going to be much more difficult to do that in a tablet. Running 11.5W through a tablet for a gaming benchmark or two is going to give you some hot hands. The only question is will Anand call them out on it, or just keep toeing the line.
 
Last edited:

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
17 Watts Intel ULV drawing 35W system power while the 15W AMD Kabini draws 20W. Hmmm.

Check out the Crysis 3 power draw for the i7 4750HQ (Iris Pro 5200)

80+ Watts for a 45 Watt cpu? That's almost as much as AMD's desktop Richland part.

Now you see why most sites don't show gaming power draw for Intel's ULV chips, right? It just doesn't add up to their TDP numbers. The reason? Turbo and benchmarketing. These graphics are drawing far more power than their 4.5W/15W/47W or whatever TDP is supposed to allow - for short periods of time that look much better in benchmarks.

That's platform power and that's not TDP.

People are going overboard here and misusing terms and presenting benchmarks which don't add anything of value. What it boils down to is far more simple:

Scenario design power is only relevant if you specify exactly the scenario. Without it, the two words following mean utterly nothing.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
That's platform power and that's not TDP.

The AMD platform is drawing 20W with a 15W CPU while the Intel platform is drawing 35W with a "17W" CPU. I'm sure you can figure out the problem here.

No screen on either. Take your best shot at where the other 12-13W is being used on the Intel platform.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Erm, TDP isn't power consumption. Virtually every chip on the planet draws more maximum power than its specified TDP figure - TDP describes how much heat a cooler must dissipate during application loads determined by the manufacturer. It is not power consumption or maximum power consumption, so your comparison isn't correct. You can buy virtually any tablet on the market and exceed the listed TDP in terms of maximum power consumption...

That same ULV Ivy Bridge chip will throttle under excessive loads if the cooling system is not up to the task, just as all mobile chips do.
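The "bursts above TDP" behavior being debated here can be sketched as a moving-window energy budget. This is only a toy model of the principle (real implementations, such as Intel's PL1/PL2 limits, use weighted averaging windows and are more elaborate): instantaneous draw may exceed the sustained limit as long as the total over any recent window stays legal.

```python
# Toy model of why short bursts can exceed "TDP" without violating it:
# the limit is enforced over a window, not instantaneously.
# (Hypothetical policy; not any vendor's actual algorithm.)
from collections import deque

def simulate(requests_w, sustained_limit_w, burst_limit_w, window=4):
    """Grant each per-sample power request, clamped so that the total over
    any `window` consecutive samples never exceeds sustained_limit_w * window,
    and no single sample exceeds burst_limit_w."""
    recent = deque(maxlen=window)
    granted = []
    for req in requests_w:
        # Headroom left in the current window, given the last window-1 grants.
        budget = sustained_limit_w * window - sum(list(recent)[-(window - 1):])
        g = max(0, min(req, burst_limit_w, budget))
        recent.append(g)
        granted.append(g)
    return granted

# A benchmark-like burst against a 15 W sustained / 25 W burst policy:
print(simulate([25, 25, 25, 25, 25, 25], 15, 25))
# -> [25, 25, 10, 0, 25, 25]
```

The first samples run well above 15W, then the budget runs out and the chip throttles hard - which is exactly the pattern a short benchmark run rewards and a sustained tablet workload punishes.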
 
Last edited:

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Both are SoCs too, integrating many of the platform components that add to power consumption.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126

Who cares if there's a press release? How does that affect the price of tea in China? I get the feeling that you're just completely evading my point. This product will not exist by itself in the retail channel. So, I have yet to see the reason why it matters if "Intel is lying!" because we aren't the ones who have to deduce whether the "true heat dissipation" fits within our cooling capabilities! That's the job of the OEMs that will integrate it into their products, and we have to hope that they do a proper job testing the product to ensure that it will work for us when we get our grubby little hands on it.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
See, that "15W" Haswell ULV is actually more like 15W SDP too. It's just that most tech sites don't measure power draw while gaming.

When they do though, the results are somewhat...surprising.



17 Watts Intel ULV drawing 35W system power while the 15W AMD Kabini draws 20W (both with screen off). Hmmm.

Anand did a test on kabini and said that

1) Kabini seems to be sub 15 watts

I also suspect the 15W TDP is perhaps a bit conservative, total platform power consumption with all CPU cores firing never exceeded 12W (meaning SoC power consumption is far lower, likely sub-10W).

2) Ivy Bridge 17 watt TDP does not include the 3.6 watt PCH.

3) Different Platforms.

4) AMD's TDP numbers don't really make that much sense for kabini.

A4-5000 : 4 cores : 1.5 Ghz : 2 MB cache : 500 mhz GPU : 15 watt TDP
E1-2500 : 2 cores : 1.4 GHz : 1 MB cache : 400 mhz GPU : 15 watt TDP

A4-1200 and A4-1250 have a 3.9 vs 8 watt TDP despite the only difference being 225 mhz vs 300 mhz GPU clock speed.

5) Not sure how they are measuring power, but power adapter efficiency may be a factor.

Check out the Crysis 3 power draw for the i7 4750HQ (Iris Pro 5200)

80+ Watts for a 45 Watt cpu? That's almost as much as AMD's desktop Richland part.

Now you see why most sites don't show gaming power draw for Intel's ULV chips, right? It just doesn't add up to their TDP numbers. The reason? Turbo and benchmarketing. These graphics are using far more power than their 4.5W/15W/47W or whatever TDP is supposed to allow - for short periods of time that look much better in benchmarks.

It's going to be much more difficult to do that in a tablet. Running 11.5W through a tablet for a gaming benchmark or two is going to give you some hot hands. The only question is will Anand call them out on it, or just keep toeing the line.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Who cares if there's a press release? How does that affect the price of tea in China? I get the feeling that you're just completely evading my point. This product will not exist by itself in the retail channel. So, I have yet to see the reason why it matters if "Intel is lying!" because we aren't the ones who have to deduce whether the "true heat dissipation" fits within our cooling capabilities! That's the job of the OEMs that will integrate it into their products, and we have to hope that they do a proper job testing the product to ensure that it will work for us when we get our grubby little hands on it.

It is called advertising: everyone goes to some uninformed site that spouts off random 4.5W numbers, lulling people into thinking that Haswell parts will have both performance and battery life. OEMs might integrate them, but consumers buy them, and you'd be surprised how many people hear a brand and take marketing hype as truth. Also, it should always matter if a company is lying or skewing facts... remember the FX-9590 5GHz scandal?
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Intel is definitely far more honest than AMD is with respect to TDP (see IDC's findings),

That's platform power and that's not TDP.

People are going overboard here and misusing terms and presenting benchmarks which don't add anything of value.

The same applies to the AMD TDP; what IDC measured was TOTAL system power draw at the wall. That has nothing to do with AMD's TDP value of 125W.
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,224
1,598
136
Don't expect an honest answer. The trolls are strong with this one.

Distributed computing on a tablet or phone? What % of people do that? .000001% of mobile users?

Yes, I *HATE* battery life, I want my android or iphone to completely suck its battery dry in an hour, or have it sit there cooking while charging all night long, all for numbers that are pathetic compared to even a netbook. No thanks.

Yeah, lately the number of idiotic threads and posts has been rising exponentially. That happens when someone has the time to post over 1,000 times in just a few months.

I wonder what the Anti-Intel Club would have said if Intel had not disclosed any measurement at all for stuff below 10W... like ARM does. They would probably say it's because the chips are bad power hogs. I bet a Snapdragon 800 running something like FurMark would easily consume 10W or more.

A reminder for all ARM fanboys:

http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/13

This is a dual-core A15 which easily uses 4W alone (and that is with throttling). Together with the GPU it would easily go up to 8W if it were not throttled. So a dual-core A15 with a Mali T-604 GPU has a TDP of about 8W but is throttled to 4W. Now a 4.5W SDP / 11W TDP Haswell doesn't look bad at all...

And this is the real reason why Intel-haters are whining about every little irrelevant detail intel does: They are "scared".

I'm getting tired of saying this, but I'm going to say it one more time: we don't want your meta commentary on other posters. Do you have a problem with another poster? Then report it. Otherwise keep your yap shut. This is a technical forum, not Facebook. So leave your gossip at home.

Big people talk about ideas
Average people talk about things
And small people talk about other people.

-ViRGE
 
Last edited by a moderator:
Mar 10, 2006
11,715
2,012
126
Yeah, lately the number of idiotic threads and posts has been rising exponentially. That happens when someone has the time to post over 1,000 times in just a few months.

I wonder what the Anti-Intel Club would have said if Intel had not disclosed any measurement at all for stuff below 10W... like ARM does. They would probably say it's because the chips are bad power hogs. I bet a Snapdragon 800 running something like FurMark would easily consume 10W or more.

A reminder for all ARM fanboys:

http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/13

This is a dual-core A15 which easily uses 4W alone (and that is with throttling). Together with the GPU it would easily go up to 8W if it were not throttled. So a dual-core A15 with a Mali T-604 GPU has a TDP of about 8W but is throttled to 4W. Now a 4.5W SDP / 11W TDP Haswell doesn't look bad at all...

And this is the real reason why Intel-haters are whining about every little irrelevant detail intel does: They are "scared".

Yup.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Yeah, lately the number of idiotic threads and posts has been rising exponentially. That happens when someone has the time to post over 1,000 times in just a few months.

I wonder what the Anti-Intel Club would have said if Intel had not disclosed any measurement at all for stuff below 10W... like ARM does. They would probably say it's because the chips are bad power hogs. I bet a Snapdragon 800 running something like FurMark would easily consume 10W or more.

A reminder for all ARM fanboys:

http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown/13

This is a dual-core A15 which easily uses 4W alone (and that is with throttling). Together with the GPU it would easily go up to 8W if it were not throttled. So a dual-core A15 with a Mali T-604 GPU has a TDP of about 8W but is throttled to 4W. Now a 4.5W SDP / 11W TDP Haswell doesn't look bad at all...

And this is the real reason why Intel-haters are whining about every little irrelevant detail intel does: They are "scared".

Well, that is the trick, isn't it? The whole A15 SoC was doing heavy computation at around 8W power draw (not TDP) before being throttled to 4W power use. The Intel 4.5W SDP is for a certain scenario, near idle I would assume, probably clocked at 800MHz. When they ramp up, they start using a lot more power; I'd say more than 20W: http://www.notebookcheck.net/Review-Acer-Aspire-P3-171-3322Y2G06as-Convertible.96583.0.html
So they aren't very comparable under load.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
The problem is not the SDP, the problem is at what application/s they measured the SDP.
Every current low power CPU can run at 4W, the problem is what work it can do at 4W.

It is another thing to game at 4W and another thing to just browse at 4W.
It is another thing to game at 4W and produce 40fps and another thing to game at 4W and produce 10fps.

And because Intel doesn't want to talk about it, it seems more like marketing BS than a real-life measurement. That makes people question the SDP's validity and Intel's reasons for using it.
 

beginner99

Diamond Member
Jun 2, 2009
5,224
1,598
136
Well, that is the trick, isn't it? The whole A15 SoC was doing heavy computation at around 8W power draw (not TDP) before being throttled to 4W power use. The Intel 4.5W SDP is for a certain scenario, near idle I would assume, probably clocked at 800MHz. When they ramp up, they start using a lot more power; I'd say more than 20W: http://www.notebookcheck.net/Review-Acer-Aspire-P3-171-3322Y2G06as-Convertible.96583.0.html
So they aren't very comparable under load.

Re-read it. It only went up to 8 W with a trick and then only for a very short time. In fact it is limited to 4 W.

And no, how you describe SDP is completely wrong, as has been stated several times in this thread. It's the "heat" the chip will produce under average tablet loads. At idle it will be even lower. Even your 45W i7 laptop CPUs clock down to 800MHz when idling, so that is pretty standard, assuming you have the appropriate power plan selected.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
Re-read it. It only went up to 8 W with a trick and then only for a very short time. In fact it is limited to 4 W.

And no, how you describe SDP is completely wrong, as has been stated several times in this thread. It's the "heat" the chip will produce under average tablet loads. At idle it will be even lower. Even your 45W i7 laptop CPUs clock down to 800MHz when idling, so that is pretty standard, assuming you have the appropriate power plan selected.


In todays Intel CPUs, AMD APUS and ARM SoCs the power distribution is bidirectional between the CPU part and the iGPU. That is, if the application needs the CPU resources then the CPU Cores have all the available TDP for themselves. But when the application needs the iGPU (like in games) then the iGPU can take all the available TDP and raise its clocks in order to have better performance. If the application needed both the CPU and the iGPU 50%/50% then the TDP would be divided between the two.
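The bidirectional power sharing described above can be sketched as a simple allocator. This is a hypothetical illustration of the idea, not any vendor's actual algorithm: CPU and iGPU draw from one package limit, and whichever block the workload stresses gets the larger share.

```python
# Sketch of a shared package power budget between CPU cores and iGPU
# (hypothetical proportional allocator, for illustration only).

def split_budget(package_limit_w, cpu_demand_w, gpu_demand_w):
    """Give each block its demand if the limit allows; otherwise share
    the package limit in proportion to demand."""
    total = cpu_demand_w + gpu_demand_w
    if total <= package_limit_w:
        return cpu_demand_w, gpu_demand_w
    scale = package_limit_w / total
    return cpu_demand_w * scale, gpu_demand_w * scale

print(split_budget(11.5, 2.0, 3.0))    # light load: both get what they ask for
print(split_budget(11.5, 11.5, 11.5))  # both saturated: split 50/50 -> (5.75, 5.75)
print(split_budget(11.5, 1.0, 22.0))   # GPU-bound game: the iGPU takes most of the 11.5 W
```

This is why the same 11.5W part can look CPU-strong in one benchmark and GPU-starved in another: the budget moves to wherever the demand is, but the total never grows.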
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
The problem is not the SDP, the problem is at what application/s they measured the SDP.
Every current low power CPU can run at 4W, the problem is what work it can do at 4W.

It is another thing to game at 4W and another thing to just browse at 4W.
It is another thing to game at 4W and produce 40fps and another thing to game at 4W and produce 10fps.

And because Intel doesn't want to talk about it, it seems more like marketing BS than a real-life measurement. That makes people question the SDP's validity and Intel's reasons for using it.

One of the news sites found that the claimed 4.5W figure is only valid when the chip is capped at 800MHz. I think this eliminates gaming from the equation and leaves only light browsing.
 