Are 4C/4T quads obsolete for gaming?

Page 4 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

eddman

Senior member
Dec 28, 2010
239
87
101
Buying the fastest that fits the budget? I thought this was the norm, so no need to make a special case.
... because some people think that a 1600/1600X with an OC would be as fast or faster than an 8400. That's not the case as witnessed in computerbase's tests and the ENTIRE point of this discussion. Read the earlier posts.

Fastest? A very personal decision. Cores, ST, MT.
This is specifically about performance in games, nothing else.
 
Reactions: ozzy702 and DooKey

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
For those still struggling with how benchmarks work:

CPU test = you minimize the GPU bottleneck by using the fastest GPU you can find, then lower the resolution if that's not enough, so the CPU cores are loaded before the GPU hits 100% usage. Anything else just results in the different CPUs idling / downclocking from Turbo while waiting on the GPU, at which point you're not properly testing the CPU at full load. Lowering the resolution gives a better idea of how a CPU that's bottlenecked by a GPU today will perform in the future on a stronger GPU. The reason no one tests 1440p / 4K in CPU reviews is the same reason no one tests 5GHz i7s on GTX 1030s.

GPU test = you minimize the CPU bottleneck by using the fastest CPU you can find, then raise the resolution if that's not enough, so the GPU is loaded before the CPU hits 100% usage. Anything else just results in the different GPUs idling / downclocking from Turbo while waiting on the CPU, at which point you're not properly testing the GPU at full load. Raising the resolution gives a better idea of how a GPU that's bottlenecked by a CPU today will perform in the future on a stronger CPU. Again, it's the same reason no one reviews GTX 1080 Tis on Celerons.

Real-world tests (e.g., "But I play at 1440p / 4K / with 60fps VSync on and only want to see how that limitation affects the CPU/GPU balance so I don't overbuy") are an entirely fair personal metric for gauging the best average CPU/GPU pairing for a given tier of hardware / budget / constraints. But in arguments between people of different brand loyalties, it often gets abused into cherry-picking: bottlenecked low-fps numbers get discarded one minute when one's favored "team" is behind, after being included when it was the other way around. That's not aimed at anyone here personally, but it's a highly visible and tedious trait among the brand fanboys who populate YouTube comment sections and certain other clickbaity "rumor" sites...

In short, 4K / 1440p gamers should still take note of the 720p benchmarks, as they basically show how much headroom your CPU has after your next GPU upgrade (i.e., a longer lifespan between upgrades). The real-world ones, like this recent one from Techspot, are there primarily to help lower-requirement gamers avoid huge mismatches. And even those are only a rough guide, as people can and do play on Med/High vs Ultra, which reduces the GPU bottleneck (often up into the next tier, e.g., in Witcher 3, 1050 Ti Med = 1060 Ultra), which in turn makes even a budget CPU's limitations more pronounced.
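The CPU-test / GPU-test logic above can be sketched as a toy frame-time model (my illustration only; the function and all per-frame costs below are hypothetical, not measurements from any review): each frame takes as long as the slower component, so shrinking the GPU's share of the work is what exposes the CPU gap.

```python
# Toy bottleneck model (made-up numbers, for illustration only):
# each frame takes as long as the slower of the two components.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower of CPU/GPU paces each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu, slow_cpu = 4.0, 6.0   # per-frame CPU cost in ms (hypothetical)
gpu_4k, gpu_720p = 16.0, 2.0    # per-frame GPU cost in ms (hypothetical)

# At 4K the GPU dominates, so both CPUs produce identical results:
print(fps(fast_cpu, gpu_4k), fps(slow_cpu, gpu_4k))      # 62.5 62.5

# At 720p the GPU cost shrinks and the real CPU gap appears:
print(fps(fast_cpu, gpu_720p), fps(slow_cpu, gpu_720p))  # 250.0 and ~166.7
```

In this model the identical 62.5-vs-62.5 result at 4K today says nothing about how the two CPUs will compare once a faster GPU shrinks that 16ms, which is exactly the point of low-resolution CPU testing.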
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,787
4,771
136
... because some people think that a 1600/1600X with an OC would be as fast or faster than an 8400. That's not the case as witnessed in computerbase's tests and the ENTIRE point of this discussion. Read the earlier posts.


This is specifically about performance in games, nothing else.
Do you even consider that minimums are a valid, or even the most important, metric for some? Some people prefer a smoother experience over just the best average. 12 threads can give a better experience depending on what the person wants, so don't assume. You are imposing your values on everyone.

I really don't understand the amazing ability of "enthusiasts" to prevent choice, as in, my way is the only way.
 
Reactions: Markfw

eddman

Senior member
Dec 28, 2010
239
87
101
Do you even consider that minimums are a valid, or even the most important, metric for some? Some people prefer a smoother experience over just the best average. 12 threads can give a better experience depending on what the person wants, so don't assume. You are imposing your values on everyone.

I really don't understand the amazing ability of "enthusiasts" to prevent choice, as in, my way is the only way.
First of all, I'm still running a Core 2 Quad. I suppose that makes me an "enthusiast".

Second, I don't have graphs at hand, but the 99th percentile for the 8400 is 10% higher than the 1600X's in BF1 multiplayer at 1080p. So there's that.

Third, where did I insist that you should buy A or B? I'm simply stating that the 8400 performs better, as is clearly shown in tests. No assumptions here. No "my way is the only way". You can buy whatever you want.

It seems that stating objective figures is somehow no longer acceptable.

P.S. If you cared to look, this sub-discussion (not the thread) is about games only. Obviously for productivity the 1600 is better, thanks to its SMT.
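For readers unfamiliar with the 99th-percentile figure mentioned above: it is derived from a frame-time log rather than an FPS average. A minimal sketch of the usual calculation (my assumption of the methodology; reviewers differ in the details, and the numbers are made up):

```python
import statistics

def percentile_fps(frame_times_ms, pct=99):
    """FPS corresponding to the pct-th percentile (near-worst) frame time."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return 1000.0 / ordered[idx]

# Mostly smooth ~8 ms frames with one 25 ms stutter: the average barely
# moves, but the 99th percentile exposes the hitch.
times = [8.0] * 99 + [25.0]
print(round(1000 / statistics.mean(times), 1))  # average FPS: 122.4
print(percentile_fps(times))                    # 99th-percentile FPS: 40.0
```

The gap between 122 and 40 in this toy data is exactly the "smoothness vs best average" distinction being argued about in this thread.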
 
Last edited:
Reactions: ozzy702

arandomguy

Senior member
Sep 3, 2013
556
183
116
In short, 4K / 1440p gamers should still take note of the 720p benchmarks, as they basically show how much headroom your CPU has after your next GPU upgrade (i.e., a longer lifespan between upgrades). The real-world ones, like this recent one from Techspot, are there primarily to help lower-requirement gamers avoid huge mismatches. And even those are only a rough guide, as people can and do play on Med/High vs Ultra, which reduces the GPU bottleneck (often up into the next tier, e.g., in Witcher 3, 1050 Ti Med = 1060 Ultra), which in turn makes even a budget CPU's limitations more pronounced.

One other thing that doesn't get talked about much is that almost all the benchmarks you see online are fairly short runs of specific scenes, but the demands a game places on hardware, especially on sub-components, can be very different from scene to scene. Why is this important? When people use these numbers to make the "good enough" argument, they can be ignoring large swathes of the game in which very real functional differences show up.
 
Reactions: BSim500 and whm1974

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
You misunderstand. If I have to choose a CPU for gaming RIGHT NOW for a certain budget, I'd look for the fastest one that fits, so that I won't have to upgrade as often.

Why didn't they upgrade? Because spending so much money on a new MB+CPU+RAM combo is not worth a 20% performance gain to a lot of people.
Only a particular type of people upgrade their CPUs often. The rest, like the vast swathes of people with overclocked Sandy Bridges, don't feel the need to do so. Because an overclocked i7 2600K or 3770K is like 15% slower than a stock 7700K in GTA V with a GTX 1080. While at stock vs stock it would be less than half as fast as a 7700K if I look at 720p numbers in a game like Starcraft 2.

So by the 720p metric, a SB/IB i7 is completely trash, but looking at the GTA V results (with an achievable overclock, which is a reasonable assumption regarding how those people are running those CPUs) the picture is completely different.

The 'future GPU will be handicapped by a slower CPU because 720p gaming says so' advocates miss how poorly 720p results reflect the way people actually play games.
 
Reactions: whm1974

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
OK, I've been following this thread and have even posted some comments here and there. So my question is: are 4C/4T CPUs obsolete for gaming right now in 2017, and for the next three to four years?

I have a Haswell i5-4670 build and I have no plans to replace it anytime soon, as I don't have the money and my system is good enough for my needs anyway. Not sure this matters, but I game only on Linux, plus a few older games using Wine.

At the moment I am planning on building a new rig in 2020, as that will give me the most value out of the money I spent on my current build.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
OK, I've been following this thread and have even posted some comments here and there. So my question is: are 4C/4T CPUs obsolete for gaming right now in 2017, and for the next three to four years?

I have a Haswell i5-4670 build and I have no plans to replace it anytime soon, as I don't have the money and my system is good enough for my needs anyway. Not sure this matters, but I game only on Linux, plus a few older games using Wine.

At the moment I am planning on building a new rig in 2020, as that will give me the most value out of the money I spent on my current build.
I wouldn't worry considering your use case, but if you were using a GTX 1070/1080/Ti today for >60 FPS gaming in the latest titles, you might find your CPU lacking in some cases when trying to maintain those FPS targets.

Sandy/Ivy Bridge i5s' days are definitely numbered for anyone aiming at >60 FPS gaming with those CPUs.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
OK, I've been following this thread and have even posted some comments here and there. So my question is: are 4C/4T CPUs obsolete for gaming right now in 2017, and for the next three to four years?

I have a Haswell i5-4670 build and I have no plans to replace it anytime soon, as I don't have the money and my system is good enough for my needs anyway. Not sure this matters, but I game only on Linux, plus a few older games using Wine.

At the moment I am planning on building a new rig in 2020, as that will give me the most value out of the money I spent on my current build.

I know you probably won't find Haswell i5s in any recent gaming benchmarks, but you can use the i5 7400/7500 or i3 8100 as a rough equivalent to your i5 4670. For the next year or two you should be fine. The new i3s are 4C/4T, so developers won't make games unplayable on these chips.

Will it still run games adequately into 2020? That's harder to say. It would probably still meet the 'minimum requirements' of most games, unless by then developers make games unplayable (or unable to run at all) on 4C/4T CPUs, like they started doing from 2015 onwards with 2C/2T dual cores. It's a possibility IMO, but of course none of us can predict the future.
 

Crono

Lifer
Aug 8, 2001
23,720
1,501
136
OK, I've been following this thread and have even posted some comments here and there. So my question is: are 4C/4T CPUs obsolete for gaming right now in 2017, and for the next three to four years?

I have a Haswell i5-4670 build and I have no plans to replace it anytime soon, as I don't have the money and my system is good enough for my needs anyway. Not sure this matters, but I game only on Linux, plus a few older games using Wine.

At the moment I am planning on building a new rig in 2020, as that will give me the most value out of the money I spent on my current build.

"Obsolete" means different things to different people. If your goal is to game at high refresh, 1080p, playing the most demanding modern titles on high settings, then the answer is "maybe". Or, more accurately, it's not the ideal processor.

The average gamer (if such a being exists) doesn't mind running at lower settings and likely isn't using a high- or variable-refresh display yet. For that person a quad-core CPU is not obsolete... yet. In 3-4 years' time we might see the majority of gamers on 6+ cores, at least on desktop.

The bottom line is that if you have the luxury of worrying about future-proofing your gaming PC, you really would be better off with a newer CPU with more cores/threads, from either AMD or Intel. Ryzen and Coffee Lake are both appealing right now, and both have good value options.

But if your budget doesn't allow for an upgrade or new build, don't worry about it. There's nothing wrong with making do with what you have and upgrading when you can.
 
Last edited:
Reactions: Insert_Nickname

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Well, I don't think I'd see enough of a performance increase to warrant upgrading to a Coffee Lake i5 or i7 right now, since that would cost ~$400 or more for the CPU, motherboard, and memory I would need.
 
Reactions: Crono

Ranulf

Platinum Member
Jul 18, 2001
2,407
1,305
136
While at stock vs stock it would be less than half as fast as a 7700K if I look at 720p numbers in a game like Starcraft 2.

So by the 720p metric, a SB/IB i7 is completely trash, but looking at the GTA V results (with an achievable overclock, which is a reasonable assumption regarding how those people are running those CPUs) the picture is completely different.

Ahh, good ol' SC2, breaking CPUs for 7 years now. The only people who should pay attention to benchmarks of it are pro or amateur SC2 gamers who want to run it at 144Hz (or otherwise), at which point the only relevant advice is to buy the CPU with the fastest single-core speed possible and make sure it's Intel, because Blizzard seems to have favored them since forever.

Now that I think about it, testers/reviewers probably like it because, like most Blizzard games, it's easy to move the install between drives or swap a drive between systems, i.e., no reinstalls.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
To be fair, bringing up TechPowerUp benchmarks is a bad idea.
I have an R7 1700 at 3.8GHz with 3200MHz DDR4. In BF1 at 720p ultra I do not get bottlenecked by the R7. I was quite surprised when I saw those numbers; they didn't match any reviews. I couldn't believe it.

I also have an RX 480. Now, when I set the resolution scale to 50% I do get CPU bottlenecked, but it's really rare to see drops below 110-130 FPS with an AMD GPU and 8GB of DDR4 in DX11 at ultra. CPU usage is sometimes insanely high, 60-70%, depending on the map we are playing. (I believe an i7 8700K won't drop below 130-140 FPS in the same scenario.)

To say that Ryzen is not capable of 144Hz gaming is pure b*llshit. Ryzen is fast, very fast, and yes, the i7 will beat it. But it matters whether you're playing SP, an empty map, or 64 players. As a casual gamer I love battles, big battles (I hate it when frame drops occur mid-battle). I want 128-player maps as soon as possible.

The most intensive mode is 64-player Operations. I've seen an i5 in 64-player Operations barely capable of producing 60-70 FPS (GTX card, DX11) with poor frametimes, dropping below 45-50 FPS: playable, but not enjoyable.

What people are saying about SMT (R5 1600X vs i5 8400) is exactly what we were saying 5 years ago about the i5 vs the i7. Your money, your choice.

Personally I would definitely go with AMD, because Intel is messing around too much. AM4 is a great socket and will be supported for 3 generations.

Who wants to upgrade on a 4-year-old motherboard; why not buy a whole new system (or wait for DDR5 or something)? That's the question, and it's always good to have more choices. AMD might offer an HBM CPU or HBM APU for AM4 in 2020-2021. Imagine a fast 8C with 16-32GB of fast, low-latency HBM on the CPU or APU. And if people on Z170 (i5/i7) had been given the choice to upgrade to an i7 8700K, it would have been a great deal.

And please don't tell me those b*llshit stories about pin layouts and how Intel's CPUs need more power. It could be handled in software.

For a casual gamer who wants to enjoy new games, don't buy a CPU with only 4 threads; buy an R5 1600 or i5 8400. Totally worth it.

PS: To all those people who say SMT/HT is bad for gaming: let them try disabling it on an i3/Pentium G4560. It's all about optimization, and sometimes too many threads can hurt.
 
Last edited:

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
To be fair, bringing up TechPowerUp benchmarks is a bad idea.
I have an R7 1700 at 3.8GHz with 3200MHz DDR4. In BF1 at 720p ultra I do not get bottlenecked by the R7. I was quite surprised when I saw those numbers; they didn't match any reviews. I couldn't believe it.

I also have an RX 480. Now, when I set the resolution scale to 50% I do get CPU bottlenecked, but it's really rare to see drops below 110-130 FPS with an AMD GPU and 8GB of DDR4 in DX11 at ultra. CPU usage is sometimes insanely high, 60-70%, depending on the map we are playing. (I believe an i7 8700K won't drop below 130-140 FPS in the same scenario.)

To say that Ryzen is not capable of 144Hz gaming is pure b*llshit. Ryzen is fast, very fast, and yes, the i7 will beat it. But it matters whether you're playing SP, an empty map, or 64 players. As a casual gamer I love battles, big battles (I hate it when frame drops occur mid-battle). I want 128-player maps as soon as possible.

The most intensive mode is 64-player Operations. I've seen an i5 in 64-player Operations barely capable of producing 60-70 FPS (GTX card, DX11) with poor frametimes, dropping below 45-50 FPS: playable, but not enjoyable.

What people are saying about SMT (R5 1600X vs i5 8400) is exactly what we were saying 5 years ago about the i5 vs the i7. Your money, your choice.

Personally I would definitely go with AMD, because Intel is messing around too much. AM4 is a great socket and will be supported for 3 generations.

Who wants to upgrade on a 4-year-old motherboard; why not buy a whole new system (or wait for DDR5 or something)? That's the question, and it's always good to have more choices. AMD might offer an HBM CPU or HBM APU for AM4 in 2020-2021. Imagine a fast 8C with 16-32GB of fast, low-latency HBM on the CPU or APU. And if people on Z170 (i5/i7) had been given the choice to upgrade to an i7 8700K, it would have been a great deal.

And please don't tell me those b*llshit stories about pin layouts and how Intel's CPUs need more power. It could be handled in software.

For a casual gamer who wants to enjoy new games, don't buy a CPU with only 4 threads; buy an R5 1600 or i5 8400. Totally worth it.

PS: To all those people who say SMT/HT is bad for gaming: let them try disabling it on an i3/Pentium G4560. It's all about optimization, and sometimes too many threads can hurt.

Battlefield 1 MP is not the *only* game on this planet, and its MP performance has already been posted by Eddman earlier in the thread. No need to rehash the same argument over and over.

No one said Ryzen is incapable of 144Hz gaming; not sure how you came to that conclusion. Very fast? That's debatable. It's all relative to the competition, and the competition is generally faster. But I'm not going to start an AMD vs Intel shitfest; this thread isn't for that.

If you want to go with AMD, go AMD. It's a free world. But the performance data is there for all to see. It's up to you, or anyone else, to make sense of that data and make the best choices based on your usage scenario.

I would agree with you that going forward, a CPU with >4 threads will generally perform better at games, but what about budget buyers? Are you going to tell someone who can only afford an i3 8100 that their CPU is shit and they can't 'enjoy' new games?
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
4C/4T isn't quite obsolete for gaming yet but it's getting there.

I'd say for a new build, Ryzen 5 1400 is the bare minimum.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
Could you stop with these nonsense graphs from TPU?
It struggles in BF, CoD, Mafia, Watch Dogs 2, ...
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
OK I've been following this thread and even post some commits here and there. So my question is: Are 4C/4T CPUs obsolete for gaming right now in 2017 and for the next three to four years?

I have a Haswell i5-4670 build and I have no plans on replacing it anytime soon as I don't have the money and my system is good enough for needs anyway. Not sure this matters, but I game only on Linux and a few older games using Wine.

At the moment I am planning on building a new rig in 2020. As that will give me the most value out of the money I spent on my current build.

Won't you be the best judge of that? When your CPU isn't fast enough for some game that's important to you, it'll be noticeable, and at that point your CPU isn't enough anymore.

My general advice, if you look for opinions on forums, is to take it all in with huge reservations. A lot of these hardware debates on forums are rather academic and often driven by people with certain agendas. They won't be playing the games you are or looking at the same usage. Going a step further, you have to wonder how many have "experience" with the situation only via benchmarks they scour the internet for.
 
Reactions: whm1974

eddman

Senior member
Dec 28, 2010
239
87
101
Only a particular type of people upgrade their CPUs often. The rest, like the vast swathes of people with overclocked Sandy Bridges, don't feel the need to do so. Because an overclocked i7 2600K or 3770K is like 15% slower than a stock 7700K in GTA V with a GTX 1080. While at stock vs stock it would be less than half as fast as a 7700K if I look at 720p numbers in a game like Starcraft 2.

So by the 720p metric, a SB/IB i7 is completely trash, but looking at the GTA V results (with an achievable overclock, which is a reasonable assumption regarding how those people are running those CPUs) the picture is completely different.

The 'future GPU will be handicapped by a slower CPU because 720p gaming says so' advocates miss how poorly 720p results reflect the way people actually play games.
What you are saying has ZERO to do with this discussion. I never said future GPUs would be "handicapped", or older CPUs would be "trash". Stop putting words in my mouth.

I'm merely saying that for a certain budget for gaming, one would want to get the fastest CPU in order to get the maximum performance out of their current and future GPUs. Is it really that hard to understand?!

Why would I buy a slower (in games) CPU when I can get a 20% faster one for about the same money? Is it clear now?

Where did I write people upgrade their CPUs often? I wrote the exact opposite, about not upgrading the CPU as often. Yes, not many people upgrade their CPUs as often, and they can extend this period by getting the fastest CPU for their money at the time; which I already wrote about.

P.S. For the record, I do NOT mean to say that 1600/1600X are slow. They are not. This is not about intel vs. AMD.
 
Last edited:

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Could you stop with these nonsense graphs from TPU?
It struggles in BF, CoD, Mafia, Watch Dogs 2, ...

How is it nonsense? Because it doesn't agree with your agenda?
Which CoD are you referring to where the 8350K struggles? Do you have anything to back up your statement, or should we just take your word at face value and ignore all the data presented, which shows the 8350K, a 4C/4T CPU, still performing well in the majority of games?

As for Watchdogs 2:

 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
I would say that a 4-core CPU is fine, but it should have Hyper-Threading. In other words, I would recommend 8 threads minimum; 4C/4T CPUs are on the low end, and if you are going to get a 4-core CPU, it should be a mainstream i7 or a Ryzen with 8 threads.
So you're saying that the Ryzen 1400 is a better choice than the i3-8100? Is it worth $50 more?
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
How is it nonsense? Because it doesn't agree with your agenda?
Which CoD are you referring to where the 8350K struggles? Do you have anything to back up your statement, or should we just take your word at face value and ignore all the data presented, which shows the 8350K, a 4C/4T CPU, still performing well in the majority of games?

As for Watchdogs 2:


The Ryzen 1600 is 30% faster than the 1600X; yep, that's logical!
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
What you are saying has ZERO to do with this discussion. I never said future GPUs would be "handicapped", or older CPUs would be "trash". Stop putting words in my mouth.

I'm merely saying that for a certain budget for gaming, one would want to get the fastest CPU in order to get the maximum performance out of their current and future GPUs. Is it really that hard to understand?!

Why would I buy a slower (in games) CPU when I can get a 20% faster one for about the same money? Is it clear now?

Where did I write people upgrade their CPUs often? I wrote the exact opposite, about not upgrading the CPU as often. Yes, not many people upgrade their CPUs as often, and they can extend this period by getting the fastest CPU for their money at the time; which I already wrote about.

P.S. For the record, I do NOT mean to say that 1600/1600X are slow. They are not. This is not about intel vs. AMD.
I'm not arguing with you in terms of AMD (1600/X) vs Intel (i5 8400/8600K). My point is clear:

Prove that 720p CPU gaming benchmarks are indicative of how well a CPU can keep up with a faster GPU in the future. Your entire argument is based on the usefulness of 720p gaming benchmarks. You can test this right now.

Take a 2600K and a 7700K, OC them to their maximum capability without delidding or other fancy tricks, stick in a GTX 1050 Ti, and note the performance delta at 720p in GTA V. Say this value is X%.

Now do the same with a GTX 1080 Ti, but this time at 1080p. Unless you can show that the performance delta is in the same ballpark as X%, this entire premise of yours is junk.
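The experiment proposed above can be sketched with the same max(CPU, GPU) bottleneck model used earlier in the thread; every number below is a hypothetical placeholder, not a measurement, so only the shape of the result matters:

```python
# Hypothetical sketch of the 720p-vs-1080p delta experiment (toy numbers).

def fps(cpu_ms, gpu_ms):
    # The slower component paces each frame.
    return 1000.0 / max(cpu_ms, gpu_ms)

def delta_pct(fast, slow):
    """How much faster (in %) the first result is than the second."""
    return (fast / slow - 1.0) * 100.0

cpu_a, cpu_b = 4.0, 6.0   # per-frame CPU cost of the two chips (made up)

# Light GPU load (720p-style test): the CPU gap shows in full.
x_720 = delta_pct(fps(cpu_a, 2.0), fps(cpu_b, 2.0))

# Heavier GPU load (1080p on a bigger card): the gap is partly masked.
x_1080 = delta_pct(fps(cpu_a, 5.0), fps(cpu_b, 5.0))

print(round(x_720), round(x_1080))  # 50 20
```

In this toy model the delta shrinks from 50% to 20% but never inverts, which is the crux of the disagreement: whether a compressed-but-persistent gap justifies the 720p methodology.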
 

eddman

Senior member
Dec 28, 2010
239
87
101
I'm not arguing with you in terms of AMD (1600/X) vs Intel (i5 8400/8600K). My point is clear:

Prove that 720p CPU gaming benchmarks are indicative of how well a CPU can keep up with a faster GPU in the future. Your entire argument is based on the usefulness of 720p gaming benchmarks. You can test this right now.

Take a 2600K and a 7700K, OC them to their maximum capability without delidding or other fancy tricks, stick in a GTX 1050 Ti, and note the performance delta at 720p in GTA V. Say this value is X%.

Now do the same with a GTX 1080 Ti, but this time at 1080p. Unless you can show that the performance delta is in the same ballpark as X%, this entire premise of yours is junk.
So just because the delta in the future might not be as large as the delta in today's 720p tests, I should buy the CPU that's slower in games for the same money?! Do you really think that's logical?! Since when is lower performance better?

I never said the difference you see now would be EXACTLY the same in the future, but there IS a clear difference, and there will continue to be one. A slower CPU will be slower in the future too.
 