Bulldozer prices leaked

Page 4 - AnandTech Forums

cusideabelincoln

Diamond Member
Aug 3, 2008
3,274
41
91
What I saw was a benchmark at 640x480 LQ and that is ridiculous.

I agree. You can't really use benchmarks at that resolution to draw conclusions about what you can expect under real-life circumstances.

Benchmarks done at that resolution are on the verge of "synthetic". They are merely a gauge of the relative performance between processors, not of the performance you'd actually see when playing the game.

What about Arma 2?

What about it? Those benches are a bit suspect, IMO. They show piss poor* megahertz scaling with the Phenom processor while other processors have the expected scaling when overclocked. On the other hand, these benches show that Phenom II does scale with clockspeed in Arma II. They also show the higher clocked Phenoms are starting to enter the lower clocked Lynnfield territory and are ahead of any dual core processors. So again, with enough clockspeed AMD started to compete against the previous generation. I think we'll see the same thing in the Bulldozer vs. SB battle, except for AMD's sake they need to (and I think will) be even more competitive in terms of performance and TDP.

*Ph 940, 3.0 GHz: 24.8 fps
Ph 955, 3.5 GHz: 25.5 fps (+16% GHz, +3% fps)
Q9650, 3.0 GHz: 26.7 fps
Q9770, 3.5 GHz: 29.8 fps (+16% GHz, +12% fps)
i7-920, 2.66 GHz: 28.2 fps
i7-920, 3.5 GHz: 36.9 fps (+31% GHz, +30% fps)
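Redoing the percentages from those numbers (a quick sketch; the rounding differs by a point here and there from the figures above):

```python
# Recompute the clock-vs-fps scaling from the Arma II numbers above.
def scaling(base_ghz, oc_ghz, base_fps, oc_fps):
    """Return (% clock increase, % fps increase)."""
    return (oc_ghz / base_ghz - 1) * 100, (oc_fps / base_fps - 1) * 100

chips = {
    "Phenom II 940 -> 955": (3.0, 3.5, 24.8, 25.5),
    "Q9650 -> Q9770": (3.0, 3.5, 26.7, 29.8),
    "i7-920 stock -> OC": (2.66, 3.5, 28.2, 36.9),
}
for name, vals in chips.items():
    ghz_pct, fps_pct = scaling(*vals)
    print(f"{name}: +{ghz_pct:.0f}% clock -> +{fps_pct:.0f}% fps")
```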
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
I am glad you remember, Ben! That's the thing: those of us who have been following hardware for a while have seen the massive hype AMD can create for its newest processor, only to watch it fall flat on its face. Now, more than ever, we have so much information at hand to make a reasonable prediction.

I think people are getting caught up in the hype and don't realize just how inefficient the Phenom architecture is, both in terms of power consumption and performance per clock:



The 2600K @ 4.7GHz is still consuming 115W less at load than a Phenom II X6 at only 4.0GHz.

And the performance advantage SB enjoys is just staggering.

So how is Bulldozer going to surpass Sandy Bridge when AMD is 2 full generations behind in performance? That sounds too optimistic to me.


I'm in no way suggesting that the Phenom2 is anywhere near as efficient as any i7 (or probably even C2D for that matter) but these numbers seem a bit fishy to me.

The 2600K has a TDP of 95 watts at its factory 3.4GHz speed. At 4.7GHz that graph shows an increase of 76.4 watts over the 3.4GHz power use. So if we assume that at 3.4GHz and 100% load the chip draws somewhere around 95 watts, adding 76.4 gives 171.4 watts for the 2600K @ 4.7GHz.

Now, that graph shows that the Ph2 uses a further 115 watts over that 4.7GHz 2600k, so that means the Ph2 uses about 286 watts (the system is 306 watts).

Is that even possible? That just sounds awfully high to me. I'm confident the 2600K isn't actually using the full 95 watts @ 3.4GHz, but it's probably in the ballpark, so that may only account for a few watts. That figure for the Phenom just seems too high, though. Maybe I'm wrong?
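For what it's worth, the arithmetic in that estimate checks out; here it is spelled out (the big assumption, as above, being that the 2600K draws close to its full 95W TDP at stock under load):

```python
# Sanity-check the power estimate above. The big assumption is that
# the 2600K draws roughly its rated 95 W TDP at stock 3.4 GHz, full load.
sb_stock_w = 95.0        # W, Intel's rated TDP for the 2600K
sb_oc_delta_w = 76.4     # W, graph's increase going from 3.4 to 4.7 GHz
sb_oc_w = sb_stock_w + sb_oc_delta_w      # ~171.4 W, 2600K @ 4.7 GHz

ph2_delta_w = 115.0      # W, graph: X6 @ 4.0 GHz draws this much more
ph2_w = sb_oc_w + ph2_delta_w             # ~286.4 W for the Phenom II alone

print(f"2600K @ 4.7 GHz: ~{sb_oc_w:.1f} W")
print(f"Phenom II X6 @ 4.0 GHz: ~{ph2_w:.1f} W")
```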
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm in no way suggesting that the Phenom2 is anywhere near as efficient as any i7 (or probably even C2D for that matter) but these numbers seem a bit fishy to me.

Now, that graph shows that the Ph2 uses a further 115 watts over that 4.7GHz 2600k, so that means the Ph2 uses about 286 watts (the system is 306 watts).

Is that even possible? That just sounds awfully high to me.

Here is another review by Bit-Tech (total system power):
http://www.bit-tech.net/hardware/cpus/2011/01/03/intel-sandy-bridge-review/11

At load:

Core i5 2500k @ 4.9ghz = 221 W
Core i7 2600k @ 4.85ghz = 234 W
Phenom II X6 1100T @ 4.2ghz = 369 W D:
 

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
He said it was "supposed" to be ground up.

Obviously some designs got canceled along the way, and what was released was an upgraded K8.

I don't remember AMD ever saying K10 was a ground-up design. I do remember them saying it was their next gen.

Plus, your statement about designs being canceled along the way is correct. They were at least 6 months into one design after K8 before canceling it, for whatever reason, and completely starting over. Apparently they decided to base the new one much more on K8 than they probably originally intended to. Who knows how many designs get canceled?


Jason
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
I don't see the point of comparing power usage between a 45nm CPU and a 32nm CPU.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Ya, AMD basically has a 3 year old architecture on a 2 year old process right now. It shouldn't surprise anyone that it doesn't compete well against Intel's 1 year old process and brand new architecture.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
You realise those benchmarks haven't changed in over a year, right? GTX 280 with no AA; what believable gaming settings those are, too.

yup, with enough resolution, extra AA, and a video card vastly underpowered relative to the CPU, I can make any game come to an equally slow crawl regardless of CPU

fact of the matter is:
1. if your video card is fast enough not to be a bottleneck (or you simply use graphics settings that won't create one), a Hz-for-Hz Intel rig will be hands-down faster, and that's before any clock-rate advantage Intel is likely to have through overclocking

*note how the Intel chips with a much lower clock rate (the 2.66GHz quads and 3.06GHz dual) are keeping up with the much higher-clocked AMD chips (the 3.3GHz six-core and 3.5GHz quad), and that the comparably clocked Intel chips (the 2500K/2600K) are 30% faster...



2. if the game is CPU-dependent enough in the first place (e.g. StarCraft 2, World of Warcraft, Civ V), the Intel rigs are, again, going to be clearly faster than the AMD ones

This current Intel advantage is also going to be commonplace in many games that aren't easily or fairly benchmarked (because the tests aren't precisely reproducible), particularly online multiplayer games with large numbers of PCs/NPCs running around on screen (this is primarily why a relatively old and graphically underwhelming game like WoW can still be so hardware-demanding).

Black Ops is another excellent example:

Sure, the Phenom II X4 970 is "good enough": it provides above 60fps average, and the 52fps minimum is acceptable. But it's a good 25-32% slower despite being clocked 833MHz higher than the Intel CPUs. Overclock the Intel CPUs and we start to get performance acceptable for a 3D display, or the baby-butt-smooth competitive advantage of a fast 2D 120Hz monitor:




3. AMD doesn't even have the budget gaming niche locked down:

the four fastest sub-$200 CPUs (not counting Microcenter-type deals where you can get the i5 2500K for $180) are all Intel, and the 4th-fastest CPU on that list is also the 5th-cheapest of the dozen choices (they are ordered by price). AMD's fastest quad, and its fastest gaming CPU on the list, often gets embarrassed by humble dual-core i3s



Don't get me wrong here, I'm in no way reveling in the fact that Intel is dominating so thoroughly (I might be if I had been smart enough to buy an i7 920 rig two and a half years ago when socket 1366 first came out! ), I'm rooting as much as anyone for Bulldozer to be a success (I can only fantasize what the new architecture might be able to do for my heavily threaded workloads), I'm just trying to keep things in perspective here.



I don't see the point of comparing power usage between a 45nm CPU and a 32nm CPU.

because if you're buying a CPU right now it's actually very relevant?

Not that it truly matters, because Intel still has the performance, power, and performance-per-watt advantage even 45nm to 45nm. It's just that, because Intel has a year-plus head start in process technology, we're in a period where even more salt is being poured into the wound, much like when Intel released its 45nm Wolfdale and Yorkfield duals and quads before AMD could release even their own underwhelming 65nm Phenom 1 line.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
yup, with enough resolution, extra AA, and a video card vastly underpowered relative to the CPU, I can make any game come to an equally slow crawl regardless of CPU

fact of the matter is:
1. if your video card is fast enough not to be a bottleneck (or you simply use graphics settings that won't create one), a Hz-for-Hz Intel rig will be hands-down faster, and that's before any clock-rate advantage Intel is likely to have through overclocking


2. if the game is CPU-dependent enough in the first place (e.g. StarCraft 2, World of Warcraft, Civ V), the Intel rigs are, again, going to be clearly faster than the AMD ones

This current Intel advantage is also going to be commonplace in many games that aren't easily or fairly benchmarked (because the tests aren't precisely reproducible), particularly online multiplayer games with large numbers of PCs/NPCs running around on screen (this is primarily why a relatively old and graphically underwhelming game like WoW can still be so hardware-demanding).

Black Ops is another excellent example:

Sure, the Phenom II X4 970 is "good enough": it provides above 60fps average, and the 52fps minimum is acceptable. But it's a good 25-32% slower despite being clocked 833MHz higher than the Intel CPUs. Overclock the Intel CPUs and we start to get performance acceptable for a 3D display, or the baby-butt-smooth competitive advantage of a fast 2D 120Hz monitor:




3. AMD doesn't even have the budget gaming niche locked down:
the four fastest sub-$200 CPUs (not counting Microcenter-type deals where you can get the i5 2500K for $180) are all Intel, and the 4th-fastest CPU on that list is also the 5th-cheapest of the dozen choices (they are ordered by price). AMD's fastest quad, and its fastest gaming CPU on the list, often gets embarrassed by humble dual-core i3s



Don't get me wrong here, I'm in no way reveling in the fact that Intel is dominating so thoroughly (I might be if I had been smart enough to buy an i7 920 rig two and a half years ago when socket 1366 first came out! ), I'm rooting as much as anyone for Bulldozer to be a success (I can only fantasize what the new architecture might be able to do for my heavily threaded workloads), I'm just trying to keep things in perspective here.

A few things.

NVIDIA GPUs are generally more CPU dependent.

Multi GPU configurations do indeed benefit greatly from a faster CPU.

I play SC2 and I have yet to see performance as low as those benchmarks (and not only that one) indicate.

And that is why I posted in the first place: many times benchmarks comparing CPUs seem to indicate that a game is unplayable with a certain CPU (like that 640x480 LQ Witcher 2 test), but the truth is that even though Intel CPUs are quite a bit faster, especially the newer SB, that is irrelevant if the game is already playable.




Notice how the newer SB chips are much better than the old Core i series as well.
 
Last edited:

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Ya, run these benchmarks with vsync enabled; I bet the two CPUs would look much more similar. For those of us who game with vsync on it makes sense, although obviously it's not a good quantitative measure of gaming performance.

Anyway, again, who cares? We already know Phenom is old; let's see if AMD actually has a good reason for announcing BD at E3.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
NVIDIA GPUs are generally more CPU dependent.
They're generally faster too. A faster GPU demands a faster CPU.

Multi GPU configurations do indeed benefit greatly from a faster CPU.
Which would be situations where the game or setup is GPU-limited. Again, the faster CPU is not only better here and now for that specific situation, it is also going to be better in the future, when there are more stressful games and faster single-GPU video cards to pair with it instead.

I play SC2 and I have yet to see performance as low as those benchmarks (and not only that one) indicate.
Do you ever actually keep track of your hardware performance while you play? Some people are simply more or less sensitive than others are when it comes to such things.

Play style can also be a significant factor; for instance, your matches simply may not last long enough for a late game with armies large enough to clash and cause any slowdown.

And that is why I posted in the first place: many times benchmarks comparing CPUs seem to indicate that a game is unplayable with a certain CPU (like that 640x480 LQ Witcher 2 test), but the truth is that even though Intel CPUs are quite a bit faster, especially the newer SB, that is irrelevant if the game is already playable.
Except there's a huge difference between "playable" and desirable - as a competitive gamer I strive for minimum frame rates no lower than 60fps, so while I may consider 60fps as merely "playable" I sure as hell desire something more along the lines of 120fps to match the refresh rate of my monitor

And CPU overkill is also relevant because, again, the faster than "playable" CPUs will not only provide more desirable performance right now, but they will be better off for when more stressful games come out.

Heck, to think more on what I've already briefly touched upon: socket 1366 i7 920 rigs have been out for two and a half years, and yet they're still better poised to take on future titles than AMD's latest and fastest counterpart. So unless someone is just going to constantly upgrade to the latest and greatest parts, it is relevant to consider parts that might be overkill, to better project system longevity and/or resale value. The cost of an s1366 i7 920 rig might have seemed steep two and a half years ago, but when we consider all that time it has been a cream-of-the-crop gaming and overall PC rig, it has actually turned out to be an incredible bargain, and it easily has a much higher resale value than a Phenom rig.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
They're generally faster too. A faster GPU demands a faster CPU.
And more expensive. On the other hand, they seem slower on slower CPUs, while other cards keep constant performance.


Which would be situations where the game or setup is GPU-limited. Again, the faster CPU is not only better here and now for that specific situation, it is also going to be better in the future, when there are more stressful games and faster single-GPU video cards to pair with it instead.

Not entirely. I've seen single cards that perform similarly to older dual cards and aren't affected, for example the 5870 vs the 4870X2.


Do you ever actually keep track of your hardware performance while you play? Some people are simply more or less sensitive than others are when it comes to such things.

Play style can also be a significant factor; for instance, your matches simply may not last long enough for a late game with armies large enough to clash and cause any slowdown.
I often do play with Fraps up, I often review replays with Fraps up, and I play Zerg, so there are loads of units moving around.

Except there's a huge difference between "playable" and desirable - as a competitive gamer I strive for minimum frame rates no lower than 60fps, so while I may consider 60fps as merely "playable" I sure as hell desire something more along the lines of 120fps to match the refresh rate of my monitor

And CPU overkill is also relevant because, again, the faster than "playable" CPUs will not only provide more desirable performance right now, but they will be better off for when more stressful games come out.

Heck, to think more on what I've already briefly touched upon: socket 1366 i7 920 rigs have been out for two and a half years, and yet they're still better poised to take on future titles than AMD's latest and fastest counterpart. So unless someone is just going to constantly upgrade to the latest and greatest parts, it is relevant to consider parts that might be overkill, to better project system longevity and/or resale value. The cost of an s1366 i7 920 rig might have seemed steep two and a half years ago, but when we consider all that time it has been a cream-of-the-crop gaming and overall PC rig, it has actually turned out to be an incredible bargain, and it easily has a much higher resale value than a Phenom rig.

That depends on many things.

Initial price, the price on the day one buys, how often one upgrades, what games one plays, whether one resells the PC or uses it to upgrade a second machine, whether one can get a Phenom II X2 and unlock it, etc.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
That one for $320 looks really good. Looking at the prices, they either want to be competitive, or their "50% over i7" performance claim was a load of crap. Or maybe they want to aim right for the 2600K with that one. The 2600K is technically 8 threads, and since AMD can't put HT into their CPUs, they only offer physical cores, which might or might not be faster per core but are physically faster as a whole chip. AMD would need a whopper of a quad core to make people happy, as many say "it's not fair to compare 8 AMD cores to 4 Intels with HT." Imagine if AMD offered HT on some of their higher-end quads; bet the market would get pretty interesting, huh? Let's face it, no one needs 8 cores for gaming, so basically the market for an 8-core is productivity, where it counts, just like the X6 1100T, the 1090T, and the 980 and 990X. Intel wasn't stupid releasing that 990X months after Sandy came out; people do care about pure performance over price.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I don't believe the market needs more cores; if their R&D went more into faster IPC, oh yeah, we would be taking off. I love my i7 950, but let's face it, not many apps can take advantage of it to its full potential; for many it's the saving grace as a gaming chip, and when you don't game it will hardly ever see 30% load. I for one just do a lot of gaming and really hope games like BF3 and the upcoming ones can take advantage of multicore processing even more.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Heck, to think more on what I've already briefly touched upon: socket 1366 i7 920 rigs have been out for two and a half years, and yet they're still better poised to take on future titles than AMD's latest and fastest counterpart. So unless someone is just going to constantly upgrade to the latest and greatest parts, it is relevant to consider parts that might be overkill, to better project system longevity and/or resale value. The cost of an s1366 i7 920 rig might have seemed steep two and a half years ago, but when we consider all that time it has been a cream-of-the-crop gaming and overall PC rig, it has actually turned out to be an incredible bargain, and it easily has a much higher resale value than a Phenom rig.

Excellent, excellent points there bunnyfubbles.

Sure, you might save $100 now by buying the Phenom system, but then in 2 years' time that CPU is already too slow again, requiring another upgrade. Unlike in the past, today a good CPU can outlast 2-3 GPU upgrades. Therefore, it makes sense to get a reasonably fast $200-300 CPU and keep it for 2-3 years.

On another note, Bulldozer will need DDR3-1866 memory? I wonder how much performance will be lost by using DDR3-1600 instead. It's a bit odd for AMD to go with the more expensive DDR3-1866 option, since it will only increase the platform cost and make it more difficult for people with more common DDR3-1333 or DDR3-1600 RAM and current AM3+ mobos to upgrade. Perhaps that market is very small, so AMD decided it wasn't worth it.

I for one just do a lot of gaming and really hope games like BF3 and the upcoming ones can take advantage of multicore processing even more.

Without a doubt next generation PS4 and Xbox will have even more CPU cores than what they currently have. The problem is those consoles probably won't ship until 2014-2015. Therefore, in the foreseeable future I imagine that most game developers won't utilize more than 4 threads for games. I imagine that 4 core Sandy Bridge or Ivy Bridge won't be any slower than an 8 core Bulldozer for games over the next 2 years. This is why the 2500k currently makes a lot more sense over the 2600k for gamers, and why the $320 Bulldozer will still be a "tough" sell over the much cheaper and insanely overclockable/energy efficient 2500k.

The tricky thing about Bulldozer is that it's not a "true" 8-core CPU, though. That's why I don't think it's unfair to compare a 4-core HT 2600K to an "8-core" Bulldozer. Both are certainly true 4-core CPUs; they just chose different approaches to maximizing multi-threaded throughput.

The concept for Bulldozer design:


AMD didn't take that concept approach, however. Instead, they chose to integrate two cores together into a fundamental building block AMD calls a "Bulldozer module." TechReport has the story, explaining this in more detail.

What Bulldozer actually ended up being:
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
8 integer execution units make it an 8 core CPU

Of course it is a true 8 core.

Did either of you read the article? :hmm:

The Bulldozer module shares portions of a traditional core, including the instruction fetch, decode, and floating-point units and the L2 cache. This means that in the best case a single Bulldozer module can achieve up to 80% of the performance of two complete cores of the same capability. Therefore it's not a true 8-core CPU, since a true 8-core CPU wouldn't lose at least 20% of its efficiency to shared resources.
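Taking the article's "up to 80%" figure at face value, the sharing penalty is easy to put in core-equivalent terms (a rough illustration only, not a measured figure for actual silicon):

```python
# Core-equivalent estimate from the "up to 80% of two full cores" figure.
# Rough illustration only, not a measured result.
modules = 4                      # an "8-core" Bulldozer = 4 modules
module_vs_two_cores = 0.80       # best case, per the article
effective_cores = modules * 2 * module_vs_two_cores
print(effective_cores)           # 6.4 full-core equivalents at best
```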

There is a lot more to it than just comparing "units" or "pipelines"; you have to look at the implementation of each design in detail. For instance, Bulldozer's two 128-bit FMAC units work together on 256-bit vectors, effectively producing a single 256-bit vector operation per cycle. Intel's Sandy Bridge has two 256-bit vector units capable of producing a 256-bit multiply and a 256-bit add in a single cycle, double Bulldozer's AVX peak.
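Counting it out from those descriptions (a simplified sketch; real throughput depends on port and scheduler details, and whether "double" holds in FLOP terms depends on whether Bulldozer's combined op counts as a fused multiply-add):

```python
# Count per-cycle 256-bit AVX throughput from the descriptions above.
# Simplified sketch; real throughput depends on ports, scheduling, etc.
LANES = 256 // 64        # 4 double-precision lanes per 256-bit vector

# Sandy Bridge: one 256-bit multiply plus one 256-bit add per cycle.
sb_vec_ops = 2
sb_flops = sb_vec_ops * LANES            # 8 DP FLOPs/cycle

# Bulldozer module: two 128-bit FMACs pairing into one 256-bit op/cycle.
bd_vec_ops = 1
bd_flops_plain = bd_vec_ops * LANES      # 4 DP FLOPs/cycle as a plain op
bd_flops_fma = bd_vec_ops * LANES * 2    # 8 DP FLOPs/cycle if fused MUL+ADD

print(sb_flops, bd_flops_plain, bd_flops_fma)   # 8 4 8
```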
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
The Bulldozer module shares portions of a traditional core.

A traditional core has one integer execution unit; the FPU (floating-point unit) was only integrated into the core later, just like the IMC (integrated memory controller) was integrated into the die.

But it is up to AMD to say how many cores each BD has (after all, they produce it), so when they say 8 cores, it is an 8-core CPU (because it has 8 integer execution units).

There is a lot more to it than just comparing "units" or "pipelines"; you have to look at the implementation of each design in detail. For instance, Bulldozer's two 128-bit FMAC units work together on 256-bit vectors, effectively producing a single 256-bit vector operation per cycle. Intel's Sandy Bridge has two 256-bit vector units capable of producing a 256-bit multiply and a 256-bit add in a single cycle, double Bulldozer's AVX peak.

BD has dual 128-bit FMACs (fused multiply-accumulate units) that can each do an ADD+MUL per cycle, and that pair up into one 256-bit AVX FMAC, so one BD AVX operation can do what Intel's MUL and ADD ports do.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
The Bulldozer module shares portions of a traditional core, including the instruction fetch, decode, and floating-point units and the L2 cache. This means that in the best case a single Bulldozer module can achieve up to 80% of the performance of two complete cores of the same capability. Therefore it's not a true 8-core CPU, since a true 8-core CPU wouldn't lose at least 20% of its efficiency to shared resources.

It is a trade.

What you also need to consider is the cost, die size and power consumption of that non-shared architecture versus the shared one.

Additionally, 80% is an average, not a best case or worst case.
 
Last edited:

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I would be happy if a 4-threaded AMD quad core could offer competition to an Intel quad core with HT. Games don't need 8 cores, and I doubt higher IPC in Bulldozer will make that huge of a difference at higher resolutions with a higher-end video card. I don't even think this will cater to gamers that well; it's perhaps more for business, since zero games will take advantage of 6- and 8-core processors. I wouldn't care if AMD had a 20-core processor for $320; if the Intel quad core was faster in single-threaded apps, I'd buy that for the same price point. I'm not interested in tons of cores.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
And by the time a game takes advantage of 8 cores anyway, no one is gonna want the processor, because we are gonna have a 10-core or a more matured version.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
I would be happy if a 4-threaded AMD quad core could offer competition to an Intel quad core with HT. Games don't need 8 cores, and I doubt higher IPC in Bulldozer will make that huge of a difference at higher resolutions with a higher-end video card. I don't even think this will cater to gamers that well; it's perhaps more for business, since zero games will take advantage of 6- and 8-core processors. I wouldn't care if AMD had a 20-core processor for $320; if the Intel quad core was faster in single-threaded apps, I'd buy that for the same price point. I'm not interested in tons of cores.

Why would you want a quad core for a single threaded app anyway?
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
The tricky thing about Bulldozer is that it's not a "true" 8 core CPU though.

Of course it is a true 8 core.

The problem is in the word "true"

it would be like arguing over whether or not the Pentium 4 was the first "true" 2GHz or 3GHz CPU, given that its radically different pipeline design and lower IPC wouldn't provide 2 or 3GHz performance relative to the architectures we had been used to with the Pentium 3 and Athlon.

what we ultimately need to focus on is real-world performance

I'm pretty sure I understand why people like RussianSensation make comments along the lines of Bulldozer not being a "true" 8-core, and it largely seems to stem from overcompensation against those who are ignorant of the architecture change and make comments along the lines of "ZOMG!! 3.8-4.2GHz ON 8 CORES, AMD CANT LOSE!!!"

When the sobering reality is that the other specs suggest a $320 price point and a 125W TDP, which don't add up to a Sandy Bridge slayer, considering that the price matches the 2600K (why not price it higher if it's truly a faster/better product?) and the TDP exceeds it (the 2600K is only 95W stock). The best-case guess from that information is that AMD's newest/best 8-core CPU will only be slightly better, not hands-down better, than Intel's newest/best 4-core.

Although I hope I'm wrong, I just won't allow myself to get over-hyped until I know the official specs and can see some actual performance results.
 