Iris Pro benchmarks are in, and... they're VERY GOOD


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Good point. Once the population has completely been dumbed down and changed buying habits, this will surely happen. Convenience > enthusiasm/security.

IGPs also advance faster than dGPUs, especially now that dGPUs are moving to three-year node changes.

It's not about being dumbed down. It's simply about all the $50-75 dGPUs not being sold, and soon the $100 cards too.

You can actually blame the dGPU makers for cutting their value products down to such poor performance.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
IGPs also advance faster than dGPUs, especially now that dGPUs are moving to three-year node changes.

It's not about being dumbed down. It's simply about all the $50-75 dGPUs not being sold, and soon the $100 cards too.

I can agree with that, but at the same time it's a tricky situation for dGPU vendors, since their R&D costs are relatively high and they can't recover them from budget cards alone.

Still, a dGPU will be the way to go if someone wants to game and only game - for those folks, full-size laptops weighing 15+ pounds will still exist. Ultrabooks are a different breed of machine, designed for very long battery life and portability, whereas full-size gaming laptops largely give up portability and battery life. For the latter category, a GTX 780M makes sense for someone gaming and only gaming. Essentially, Iris Pro is the compromise for an 8+ hour battery life system with extreme portability. Conversely, a full-size laptop will have no battery life worth mentioning, will not be portable (it is annoying to carry a 17-pound system around), and will not care about TDP. With that being the case, Iris Pro is a very good compromise between the two - it provides outstanding graphics performance within a mere 47W TDP.

That being said, we're getting closer to the point where 1080p gaming will be possible on an integrated GPU - even in the most demanding games. Iris Pro doesn't quite hit that mark, but it doesn't need to - it is designed for very long battery life and portability. Anyway, I'm very excited about what GT4 Broadwell can do in this respect; Intel is inching ever closer to having the best of both worlds. GT3e/Iris Pro nails it for ultrabook/MacBook form factors, although folks won't use those machines purely for gaming. Those who do game on them will likely dial down the resolution in order to do so.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Actually, the results are pretty impressive; compared to the HD 4000 this is a huge leap.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Good point. Once the population has completely been dumbed down and changed buying habits, this will surely happen. Convenience > enthusiasm/security.

NV Optimus is working flawlessly in my XPS machine. It is convenient, and I don't think users really care whether it's a separate GPU or not.

I don't know how important battery life is when gaming on midrange hardware and up.

For high-end laptops I think we will have separate GPUs for many years. There is a lot of economic sense in doing so: flexibility, and the problem of producing huge dies in small volume if you integrate it all. APUs are for the volume market, IMHO.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Actually, the results are pretty impressive; compared to the HD 4000 this is a huge leap.

That's a little like saying a 7950 is a huge leap over a 7750.
The Iris Pro would be very interesting if the efficiency of the GPU had improved and the price were cut in half.
I think the most relevant comparison to the 4000 is the new 4400, even though its gains come from more EUs. I think this generation of Intel GPUs was quite unimpressive compared to the 3000 and 4000 series, and my bet is that the architectural efficiency improvements we will see in Broadwell will make that clear.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
That's a little like saying a 7950 is a huge leap over a 7750.
The Iris Pro would be very interesting if the efficiency of the GPU had improved and the price were cut in half.
I think the most relevant comparison to the 4000 is the new 4400, even though its gains come from more EUs. I think this generation of Intel GPUs was quite unimpressive compared to the 3000 and 4000 series, and my bet is that the architectural efficiency improvements we will see in Broadwell will make that clear.

Actually, after Larrabee I thought I would write them off as a GPU manufacturer going forward, so this is a nice turnaround. Intel is already way ahead in the CPU segment; if they can pair that with a decent enough GPU, NV and AMD have serious trouble ahead. Look at market share: Intel's share is either unchanged or keeps increasing. The dGPU market is plummeting, no denying that. NV and AMD need to do something dramatic with their next architectures.
 
Aug 11, 2008
10,451
642
126
This is nice performance. The problem I see is cost. If they could make this IGP standard on quad-core mobile chips without raising the price, it would be a great advance. As it is, it is basically a niche product for Apple and super-expensive ultrabooks. Maybe with Broadwell they can make something with performance like this the mainstream, while offering a still higher-performance upscale version.

Strangely, I am seeing thin-and-light models advertised with the GT 740M at Micro Center. Too bad they can't just use Iris Pro. It must again be the cost factor.
 

joshhedge

Senior member
Nov 19, 2011
601
0
0
You should check again, because Iris Pro is within 1-2 fps at 1080p with updated drivers (you say you game at 1080p; good luck with that, BF3 chugs along at 10 fps on EITHER solution). Maybe you just play weak games, relatively speaking. Let's be clear that you won't game at 1080p on either solution unless you don't mind playing at 5-15 fps. The framerate will be roughly the same with either solution.

At 1366x768, which is far more common for ultrabooks and non-gaming laptops, the Iris Pro soundly beats the GT 650M.

Besides which, it is already a known fact that the MacBook Pro is now using Iris Pro - it will not be using discrete graphics. With this level of performance and the added battery life, the trade-off was well worth it; even excluding Haswell's better battery life, the new rMBP immediately cuts 45W of TDP compared to the 2012 model.

It's not a known fact, because no leaks have been published suggesting so. I don't deny that Iris Pro appears to be a viable replacement for a dGPU, but I wouldn't rule out a dGPU just yet. Obviously I have no doubt the 13" will be using Iris Pro.

Doesn't look like 10 fps to me, more like 47: http://uk.hardware.info/reviews/477...end-of-mid-range-gpus-battlefield-3-1920x1080

I don't really play shooters; they're not mentally challenging enough for me. I have perfectly acceptable frame rates, 30+, playing the majority of my games at 1080p; the only game I've had issues with is The Witcher 2. Iris Pro will not beat the GT 650M *cough 660M* in the 15" rMBP, and hence I believe Apple would be stupid to include Iris Pro; it's not as if the laptop requires more battery life.

Why would Apple downgrade?
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
232
106
It's not about being dumbed down.
In a sense, people no longer demand cutting-edge graphics and technological advances. Most people feel 'OK' playing Poker, FarmVille, Angry Birds and the like. People are 'OK' voting with their dollars for these toys. Why should manufacturers bother at all?

I suppose the times have changed and we're just through another cycle, starting over. To hell with that, I say.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
In a sense, people no longer demand cutting-edge graphics and technological advances. Most people feel 'OK' playing Poker, FarmVille, Angry Birds and the like. People are 'OK' voting with their dollars for these toys. Why should manufacturers bother at all?

I suppose the times have changed and we're just through another cycle, starting over. To hell with that, I say.


This will happen with CPUs too.


I just hope we will still get reasonably priced stuff in some E-segment for both dGPUs and CPUs.

Haswell being released before Ivy Bridge-E shows that we're on a bad path for the absolute-performance enthusiast segments.

(Granted, the software ecosystems aren't exactly helping out either.)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
In a sense, people no longer demand cutting-edge graphics and technological advances. Most people feel 'OK' playing Poker, FarmVille, Angry Birds and the like. People are 'OK' voting with their dollars for these toys. Why should manufacturers bother at all?

I suppose the times have changed and we're just through another cycle, starting over. To hell with that, I say.

I think you miss the point. If anything, AMD and nVidia are shipping more high-end GPUs than ever, and PC gaming is rapidly expanding on all fronts, not just low-end casual games.

However, when an IGP is as fast as a value dGPU, you simply don't sell the dGPU; people use the IGP instead. And IGP performance advances much faster than dGPU performance, meaning it will continue to take market share from the dGPU segment.

It's no different from getting an onboard NIC, sound, storage controller and so forth. Yet we didn't lose anything there, and progress continued.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
In a sense, people no longer demand cutting-edge graphics and technological advances. Most people feel 'OK' playing Poker, FarmVille, Angry Birds and the like. People are 'OK' voting with their dollars for these toys. Why should manufacturers bother at all?

I suppose the times have changed and we're just through another cycle, starting over. To hell with that, I say.

You're not understanding what segment Iris Pro is aimed at.

You must keep in mind that Iris Pro is designed for super small form factors with long battery life - in this respect Iris Pro is unmatched. The market has shown that consumers value portability and battery life; this is EXACTLY why ARM SoCs are a "thing" now. If anything, the iGPU advances with Iris Pro are insanely good, and it's only getting better with Broadwell. Show me another 47W TDP quad-core mobile CPU + GPU that can perform that well - you won't be able to.

On the other hand, you can still get a full-size, hunk-of-junk 17-pound laptop with your discrete GPU if you want, complete with 30 minutes of battery life at 100% load. GTX 780M SLI - take your pick. I value portability and battery life, and Iris Pro delivers great performance and is UNMATCHED in its TDP class. That's what Iris Pro is all about. In fact, I see a future where Intel steps up the iGPU game so much that 1080p gaming becomes a thing - Broadwell will come close to that, Skylake even more so. It should be an interesting couple of years down the road.

I'm not saying full-size laptops are bad. But they're not portable if you're gaming. They weigh a lot more, and they get hot and loud just like desktops. If you want all-out performance, by all means, that's what you want. But you're not using it on a battery charge if you're gaming.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Ultrabooks are the future of portable form factors; a full-size laptop is not portable. Have fun carrying a 17-pound hunk of junk around. If you want a full-size gaming laptop that can't be carried, gets 1 hour of battery on a charge, and weighs 17 pounds, yeah, a dGPU is the way to go. The ultrabook is a different breed of machine and is the future.

Ultrabooks aren't designed specifically for gaming either, although Iris Pro is proving to perform VERY well even in the ultrabook form factor - clearly, if you're a gamer and only a gamer, a laptop with a dGPU is the best purchase. Anyway, your last point is not correct, since even mid-range dGPUs add 40W alone to the TDP. So 87W TDP versus 47W TDP (Iris Pro + quad-core mobile). Iris Pro literally halves the TDP.

Bit of an exaggeration there.



Even under heavy battery life testing you can get 3 hours of use out of a monster 10 pounder.

And under gaming load, TDP != power consumption. Given the inherent nature of Iris and all IGPs, power consumption under gaming load will be around TDP levels (similar to how ULV CPUs will consume TDP levels of power), as the available thermal headroom is shunted toward GPU boost clocks. As in the AT review, there was a gain when the TDP of the CPU was raised to 55 watts, indicating that turbo on the IGP is not fully realized at 47 watts. Running a game on a CPU + dGPU, power consumption is nothing like the two TDPs added together. This is because the CPU TDP already covers the cores + IGP together, while the GPU TDP reflects the GPU under high-stress loads. So under a really heavy game such as Crysis 3, the CPU cores might be consuming 25-30 watts out of the 45-watt TDP, with the 45-watt dGPU using around 30-35 watts.
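To put that back-of-envelope math in one place, here's a tiny sketch (plain Python; the wattages are the rough figures from the paragraph above, treated as illustrative assumptions rather than measurements):

```python
# Rough gaming-load power estimate, using the illustrative numbers above.
# All figures are assumptions for the sake of the example, not measurements.

def naive_tdp_sum(cpu_tdp_w, dgpu_tdp_w):
    """The misleading figure you get by simply adding the two TDPs."""
    return cpu_tdp_w + dgpu_tdp_w

def estimated_gaming_draw(cpu_cores_w, dgpu_w):
    """More realistic: CPU cores well under their TDP, dGPU below its FurMark-level draw."""
    return cpu_cores_w + dgpu_w

cpu_tdp, dgpu_tdp = 45, 45      # 45 W mobile quad + a 45 W-class dGPU
cpu_cores_in_game = 27          # ~25-30 W of the CPU TDP actually used in a heavy game
dgpu_in_game = 33               # ~30-35 W for the dGPU in games

print(naive_tdp_sum(cpu_tdp, dgpu_tdp))                        # 90 W "on paper"
print(estimated_gaming_draw(cpu_cores_in_game, dgpu_in_game))  # ~60 W in practice
```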

Under FurMark, the 660M in my laptop consumes more than 45 watts. That's primarily because FurMark raises the voltage from 0.9375 V to 1.0875 V (+0.15 V), resulting in a massive increase in power consumption. In games the 660M runs at 0.9375 V and generally only consumes about 30-40 watts, depending on the game.

The benchmarks from a few months ago used old drivers, which Intel recently updated. The Iris Pro now beats the GT 650M in most benchmarks while having HALF the TDP.

The GT 650M alone is a 45W TDP part. The Iris Pro offers substantial power benefits.

Even using accelerated functions in a browser (which will use the dGPU), an Iris Pro would completely destroy any dGPU setup in battery life for light usage, even browsing or media consumption. Again, it's no surprise that Apple ditched discrete altogether. This will only get worse in the future - AMD and NVIDIA will get shut out of high-end ultrabooks (which are the future of form factors) and will be relegated only to full-size 17-pound laptops. Impressive feat by Intel, IMHO.

You are completely wrong. Rarely under light loads does the dGPU come into play. If it does and you want to stop it, there is one really easy step: open the NVIDIA Control Panel, set the global setting to integrated graphics, then go to the application profiles and set the game/program .exe files to 'high performance'. It takes a minute (unless you have a massive Steam library) and lasts forever.

No, that's not true. The dGPU is used during light usage for media consumption and accelerated browser functions, but it won't be at 100% GPU load, obviously.

The dGPU isn't "inactive" 100% of the time outside of gaming; I don't know where you get this idea. If that were true, dGPUs would be useless, since most ultrabooks and MacBooks are NOT designed for gaming. The fact of the matter is, dGPUs are used at low clock speeds during accelerated functions, even in browsers. Chrome and Firefox use your dGPU, even with Optimus, during light usage. It won't match gaming GPU loads, but nonetheless even these types of applications rely on the dGPU for accelerated functions.

The notion that the dGPU is completely turned off 100% of the time outside of a game is the silliest thing I have ever heard. I hope you understand that MacBooks aren't used for gaming. The fact of the matter is that Iris Pro matches the GT 650M while having half the TDP. Same performance, 47W TDP versus 92W TDP. I hope you see the implications of this.

Optimus does not generally select the dGPU for basic browser acceleration. Apple does do that for Chrome, but that's a driver problem that Apple needs to fix. And yes, the dGPU is turned off 100% of the time (though certain traces on the mobo remain active, consuming minuscule amounts of power) when not gaming under Optimus, unless certain conditions are met.

http://www.nvidia.ca/object/optimus_technology.html

As per the Optimus whitepaper, there are three primary call types that trigger the dGPU:

DX calls (3D game applications or DX programs)
DXVA calls (video playback)
CUDA calls

Optimus is also quite intelligent. Play an older game and it may use the IGP. Video has to be high quality to trigger dGPU usage. Set a profile to the high-performance GPU and use Office, and it will still run on the IGP. AT's heavy battery life test includes 1080p H.264 12 Mbit/sec video playback that doesn't trigger the dGPU.
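Just to illustrate the routing logic described above, here's a toy model (Python; this is not NVIDIA's actual API or driver code, just a sketch of the behaviour this post describes, using the call types and profile settings mentioned here):

```python
# Toy model of Optimus GPU routing as described above (not NVIDIA's real API).
# The dGPU is only engaged for certain call types, subject to the global
# setting and any per-application profile set in the control panel.

def pick_gpu(call_type, app_profile=None, global_setting="integrated"):
    """Return which GPU a workload lands on under the rules sketched in the post."""
    preference = app_profile or global_setting
    if preference == "integrated":
        return "IGP"                          # forced onto the integrated GPU
    if call_type in ("DX", "DXVA", "CUDA"):   # the three trigger classes from the whitepaper
        return "dGPU"
    return "IGP"                              # office work, light browsing, low-bitrate video

print(pick_gpu("DX", app_profile="high performance"))        # dGPU: a game with a profile set
print(pick_gpu("DXVA"))                                      # IGP: global setting is integrated
print(pick_gpu("none", global_setting="high performance"))   # IGP: nothing triggers the dGPU
```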
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
You're not understanding what segment Iris Pro is aimed at.

You must keep in mind that Iris Pro is designed for super small form factors with long battery life - in this respect Iris Pro is unmatched. The market has shown that consumers value portability and battery life; this is EXACTLY why ARM SoCs are a "thing" now. If anything, the iGPU advances with Iris Pro are insanely good, and it's only getting better with Broadwell. Show me another 47W TDP quad-core mobile CPU + GPU that can perform that well - you won't be able to.

On the other hand, you can still get a full-size, hunk-of-junk 17-pound laptop with your discrete GPU if you want, complete with 30 minutes of battery life at 100% load. GTX 780M SLI - take your pick. I value portability and battery life, and Iris Pro delivers great performance and is UNMATCHED in its TDP class. That's what Iris Pro is all about. In fact, I see a future where Intel steps up the iGPU game so much that 1080p gaming becomes a thing - Broadwell will come close to that, Skylake even more so. It should be an interesting couple of years down the road.

I'm not saying full-size laptops are bad. But they're not portable if you're gaming. They weigh a lot more, and they get hot and loud just like desktops. If you want all-out performance, by all means, that's what you want. But you're not using it on a battery charge if you're gaming.

Iris Pro is a 47-watt-TDP part. It's not designed for 'super small' - small, maybe, but definitely not the MBA form factor.

Yep, my i7-4800MQ laptop with 780M SLI will last 30 minutes under heavy load (mainly because the GPUs will run much slower than nominal, as the battery can't discharge fast enough), but seriously, unless you are looking at a large battery with very low platform power consumption, Iris Pro isn't going to last that long either. 47 watts of TDP (assuming power consumption will be similar, as IGP boost clocks will likely use up a lot of the headroom) + 10-15 watts for screen/RAM/HDD/Wi-Fi = 57-62 watts under load, so maybe 60 to 90 minutes of gaming on a large battery. Probably less, considering that a small form factor plus 47 watts of cooling is not going to leave much space for a battery (I doubt it will be over 76 Wh for a PC notebook; Apple could possibly put in a larger one). The fact is, pretty much no one games on battery.
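A quick back-of-envelope version of that runtime estimate (illustrative Python; the wattages and the 76 Wh battery are the assumptions from the paragraph above, not measurements):

```python
# Back-of-envelope battery-runtime estimate while gaming on the IGP.
# Battery capacity and wattages are illustrative assumptions from the post above.

def runtime_minutes(battery_wh, system_draw_w):
    """Runtime in hours = capacity (Wh) / average draw (W); returned in minutes."""
    return battery_wh / system_draw_w * 60

battery_wh = 76                       # assumed large notebook battery
cpu_igp_draw = 47                     # Iris Pro CPU+IGP near its TDP under gaming load
platform_draw = 12                    # screen / RAM / HDD / Wi-Fi, roughly 10-15 W

total = cpu_igp_draw + platform_draw  # ~57-62 W total system draw
print(round(runtime_minutes(battery_wh, total)))  # ~77 minutes, i.e. the "60 to 90 minutes" ballpark
```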

Not to mention that a ULV-class CPU + a midrange 740M/750M will consume a similar amount of power under gaming load and probably last longer (integrated PCH, no eDRAM).

And you don't have to exaggerate with a 17-pound SLI 780M notebook. Compare it to the MSI GE40 or Razer Blade (done right, with the display fixed up, etc.) and the comparison changes dramatically. And the GE40 doesn't seem to have a problem running on a 90-watt adapter, so clearly the method of simply adding TDPs (35-watt CPU + ~50-watt GPU + 10-watt platform = 95 watts) doesn't really work.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
http://translate.google.de/translat...3/intel-iris-pro-5200-grafik-im-test/&act=url

That far more comprehensive review shows it in a much less favourable light. Check out the power draw. Efficiency-wise it's barely better than the desktop A10-6700.

That review is really weird. The A10-6800K should not be 20% faster than the A10-5800K; every other review has shown that the average improvement is around 3-5% (GPU clocks went from 800 MHz to 844 MHz). Note that the 750M there is using DDR3 RAM (indicated by the 1800 MHz figure = 900 MHz actual, which is common for DDR3; if it were GDDR5 it would be 1000 or 1250 MHz).
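For anyone puzzled by the clock math, here's a small sketch of why "1800 MHz" points to DDR3 (Python; the clock values are typical figures used for illustration, not spec-sheet data for that specific card):

```python
# DDR3 transfers data twice per memory clock; GDDR5 effectively transfers four
# times per command clock, which is why GDDR5 parts list much higher data rates.

def effective_rate_mhz(memory_clock_mhz, mem_type):
    pumps = {"DDR3": 2, "GDDR5": 4}
    return memory_clock_mhz * pumps[mem_type]

print(effective_rate_mhz(900, "DDR3"))    # 1800 -> the "1800 MHz" DDR3 figure in the review
print(effective_rate_mhz(1250, "GDDR5"))  # 5000 -> what a GDDR5-equipped 750M would list
```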

http://www.notebookcheck.net/Review-Schenker-S413-Clevo-W740SU-Notebook.98313.0.html

It's a Clevo, so I'm not going to use its battery numbers, but clearly my hypothesis appears to be correct. The HD 5200 requires a ton of power when playing games (looks like 30 watts according to HWiNFO, which is generally decently accurate). CPU boost is absent and the Vcore is really low. Total CPU power is right at TDP (to slightly over). Take it with a grain of salt, because the IA core power looks really low.

It's also (averaging three games at various settings) about 20% faster than a 64-bit GDDR5 740M with a ULV CPU and 20% slower than a 750M with DDR3. Metro: Last Light tends to do quite well on Intel.
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
That is not the performance you would expect from a 22nm, ~200 mm² chip (84 mm² of eDRAM + at least half of the 284 mm² die). Comparing it to NVIDIA's 28nm, 112 mm² chip is somewhat unfair. Going by die size, we should be looking at somewhere between GTX 660 (GK106) and GTX 680 (GK104) performance. So, no... the performance is nowhere near good.

The eDRAM cache is only necessary to overcome the fact that GDDR5 is not an option for the Iris Pro.
 

rootheday3

Member
Sep 5, 2013
44
0
66
That is not the performance you would expect from a 22nm, ~200 mm² chip (84 mm² of eDRAM + at least half of the 284 mm² die). Comparing it to NVIDIA's 28nm, 112 mm² chip is somewhat unfair. Going by die size, we should be looking at somewhere between GTX 660 (GK106) and GTX 680 (GK104) performance. So, no... the performance is nowhere near good.
I would like to see power consumption tests. So far I haven't seen any...

The 84 mm² of eDRAM isn't part of the graphics die area; it is there for memory bandwidth. If you are going to count that against the Iris Pro, you have to count all the 1-2 GB of DDR3/GDDR5 on the dGPU board against its area total too.

So you are back to ~130 mm², and the comparison to GK107 (112 mm²) is reasonable.
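A quick accounting of the two views being argued here (Python; all figures are the posters' own estimates from this exchange, not die-shot measurements):

```python
# Die-area accounting from the numbers quoted in this exchange (posters' estimates).

haswell_gt3e_die = 284   # mm^2, full quad-core + GT3 die
edram_die = 84           # mm^2, Crystalwell eDRAM (separate die)
gk107_die = 112          # mm^2, the NVIDIA GPU it is compared against
gpu_slice = 130          # mm^2, rough share of the main die taken by the GPU (a bit under half)

counting_edram = gpu_slice + edram_die   # ~214 mm^2: the "count everything" view
gpu_only = gpu_slice                     # ~130 mm^2: eDRAM treated like the GDDR5 chips on a dGPU board

print(counting_edram, gpu_only, gk107_die)
```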
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The 84 mm² of eDRAM isn't part of the graphics die area; it is there for memory bandwidth. If you are going to count that against the Iris Pro, you have to count all the 1-2 GB of DDR3/GDDR5 on the dGPU board against its area total too.

So you are back to ~130 mm², and the comparison to GK107 (112 mm²) is reasonable.

And the area of all the DDR3 DIMMs that the Iris Pro is using... c'mon now. Since when is cache not part of the processor?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
And the area of all the DDR3 DIMMs that the Iris Pro is using... c'mon now. Since when is cache not part of the processor?

By that measurement, GDDR is cache, since it's all loaded from main memory in the first place.

There are two options: GDDR (for everything) or eDRAM, as we have also seen in the consoles.

Previously, AMD also used GDDR as a SidePort cache.
 

Sheep221

Golden Member
Oct 28, 2012
1,843
27
81
You're not understanding what segment Iris Pro is aimed at.

You must keep in mind that Iris Pro is designed for super small form factors with long battery life - in this respect Iris Pro is unmatched. The market has shown that consumers value portability and battery life; this is EXACTLY why ARM SoCs are a "thing" now. If anything, the iGPU advances with Iris Pro are insanely good, and it's only getting better with Broadwell. Show me another 47W TDP quad-core mobile CPU + GPU that can perform that well - you won't be able to.

On the other hand, you can still get a full-size, hunk-of-junk 17-pound laptop with your discrete GPU if you want, complete with 30 minutes of battery life at 100% load. GTX 780M SLI - take your pick. I value portability and battery life, and Iris Pro delivers great performance and is UNMATCHED in its TDP class. That's what Iris Pro is all about. In fact, I see a future where Intel steps up the iGPU game so much that 1080p gaming becomes a thing - Broadwell will come close to that, Skylake even more so. It should be an interesting couple of years down the road.

I'm not saying full-size laptops are bad. But they're not portable if you're gaming. They weigh a lot more, and they get hot and loud just like desktops. If you want all-out performance, by all means, that's what you want. But you're not using it on a battery charge if you're gaming.
The ultrabook is a nonsense platform, created out of free cash Intel had to spend on something just to fill out its business plans.
A conventional 13" laptop with an iGP will be just slightly heavier than a 13" ultrabook, only slightly slower, and will last a bit less on battery. Yet the ultrabook costs 3-4 times more.
If you want a portable computer but a laptop is not good for you, you get a tablet. The ultrabook as a device doesn't solve any technical problem, be it performance, portability or battery life. It's nothing more than a trend/fashion item that looks like a copied MacBook. I have a roughly four-year-old Acer laptop with an iGP and a 15" screen, and I can carry it anywhere; it's nowhere near as heavy as you suggest laptops are. I can work on battery for a pleasant 2.5 hours, which is enough time for anyone to spend out and about.

I agree that gaming laptops, with their paired discrete graphics cards, a lot of cooling pipes and large screens, are too heavy for everyday carrying, yet they are a separate category of computers that ultrabooks aren't intended to compete with.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,879
3,230
126
The Haswell NUC has HD 5000 series graphics... which I believe is the Iris.

Lol... so far none of the desktop parts have been slated to feature this IGP.
 

mavere

Member
Mar 2, 2005
187
2
81
IGPs also advance faster than dGPUs, especially now that dGPUs are moving to three-year node changes.

It's not about being dumbed down. It's simply about all the $50-75 dGPUs not being sold, and soon the $100 cards too.

You can actually blame the dGPU makers for cutting their value products down to such poor performance.

Intel has the luxury of building up its efficiency-focused designs, while the others must try to break down their monolithic beasts. Intel Iris is practically the ARM of GPUs.

Now if only the prices were actually reasonable...
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
That review is really weird. The A10-6800K should not be 20% faster than the A10-5800K; every other review has shown that the average improvement is around 3-5% (GPU clocks went from 800 MHz to 844 MHz).

That's true if you use 2133 MHz memory for both of them, but only the A10-6800K officially supports 2133 MHz memory, and it shows more than a 3-5% difference against an A10-5800K running 1866 MHz memory in gaming.
 