[HardwareHaven] New Review shows FX-8150 beating i7 2600k at gaming

Page 3 - AnandTech community

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
Not a bad CPU, actually. It certainly has its strengths depending on workload and review site, and will only get stronger in the near and long term. Of course, as expected, the Intel shills, puppets and investors are hard at work trying to convince folks why they shouldn't like it.

A couple more here. It's nowhere near the failure certain folks are trying to attach to it.

http://www.techspot.com/review/452-amd-bulldozer-fx-cpus/

http://www.legitreviews.com/article/1741/1/

http://www.rage3d.com/reviews/cpu/amd_fx_8150/

Uh...review thread is a sticky, check it out :thumbsup:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I can't understand why you refuse to acknowledge the uselessness of benching a CPU at high res with full settings; it proves nothing.

When 28nm cards hit next year, people with a Bulldozer may well find it bottlenecking their new card, and therefore not getting as high an FPS as those with Sandy Bridge setups. That is what the linked benches prove: Bulldozer is weak in gaming, and CPUs with low IPC fare poorly in gaming. Fact.
What you don't seem to clue into is that GAMERS play at 1080p and up, with powerful PCs and with settings as maxed as they can get away with.

FACT: i bench with GTX 580 SLI and HD 6970 TriFire-X3. And if that setup doesn't get bottlenecked, next year's won't either.

FACT: There is no game you can point to that is playable on SB at high resolutions and maxed details but not on BD (aside from driver issues).

Probably the most difficult issue for AMD is convincing people to upgrade from a fast quad Phenom II - those will be real bargains for gamers.
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
What you don't seem to clue into is that GAMERS play at 1080p and up, with powerful PCs and with settings as maxed as they can get away with.

FACT: i bench with GTX 580 SLI and HD 6970 TriFire-X3. And if that setup doesn't get bottlenecked, next year's won't either.

FACT: There is no game you can point to that is playable on SB at high resolutions and maxed details but not on BD (aside from driver issues).

Probably the most difficult issue for AMD is convincing people to upgrade from a fast quad Phenom II - those will be real bargains for gamers.

FACT: It is still useless for determining which CPU is the best for gaming. In your benchmarks a dual core would probably show the exact same results. All you are doing is benching GPUs, nothing more.
 

Elixer

Lifer
May 7, 2002
10,376
762
126
They do. However, the ASUS AM3+ MB they sent to me was DoA and i had to wait two days for a replacement
:'(

And there is suspect CrossFire scaling with the FX-8150 that is not there with the Phenom II 980BE in the same PC, so i can't rule out the MB having some issues.

On top of that, i couldn't get 4.5GHz stable on air (with a Thermalright Ultra-120 Extreme that gets my Phenom 980BE to 4.3GHz).

Any chance we can see some benchmarks in Linux?
Like from here: http://www.phoronix-test-suite.com/ ?
If you don't want to try Linux, that is OK (though I have yet to see a good review of BD on Linux). The suite runs on Linux, OpenSolaris, Mac OS X, Windows 7, and BSD operating systems.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
FACT: It is still useless for determining which CPU is the best for gaming. In your benchmarks a dual core would probably show the exact same results. All you are doing is benching GPUs, nothing more.
i really don't think you get it at all. Why not bench at 640x480, if all you want to show is which CPU can play a game at 1000fps vs 1200fps?

What i am doing is showing the relative performance of 3 CPU platforms using the same video cards - and in two cases, the only difference is the CPU. However, i am benching at the resolutions that gamers play at - 1080p and above - and i use sufficiently powerful graphics that DOES show the relative performance differences of the CPUs.

You also need to realize that these reviews were all RUSHED. Part two will use much more powerful graphics - GTX 580 SLI/HD 6970-X3 - to shift the burden to the CPU, and even higher resolutions. Then we will have a pattern of how each CPU handles the load.

Any chance we can see some benchmarks in Linux?
Not from me; i am way behind in benching. However, i am trying to get another reviewer who specializes in Linux to write for ABT.
 
Last edited:

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
i really don't think you get it at all. Why not bench at 640x480, if all you want to show is which CPU can play a game at 1000fps vs 1200fps?

What i am doing is showing the relative performance of 3 CPU platforms using the same video cards - and in two cases, the only difference is the CPU. However, i am benching at the resolutions that gamers play at - 1080p and above - and i use sufficiently powerful graphics that DOES show the relative performance differences of the CPUs.

Even with the best cards today, games can still be GPU-bottlenecked. Running at 640x480 would show the CPUs' strengths far more than what you are doing.

I don't care either way - do what you want - but you cannot claim Bulldozer is a good gaming CPU; Bulldozer is terrible for gaming. Gamers should stay far, far away from Bulldozer for the time being.

Rushed or not, the reviews all show accurate, relevant benchmarks that tell the same story: Bulldozer is slightly worse than PII in games. Time will not change this. Piledriver might.
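The bottleneck argument both sides keep circling can be put in code. A toy sketch (all per-frame millisecond figures below are made-up illustrations, not measurements): frame rate is limited by whichever of the CPU or GPU takes longer per frame, so a slower CPU only becomes visible once the GPU's share of the work is small.

```python
# Toy bottleneck model: a frame ships only when both the CPU and the GPU
# have finished their share of the work, so effective frame time is the
# slower of the two. All millisecond figures are illustrative.

def fps(cpu_ms, gpu_ms):
    """Frames per second when per-frame CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# High res, max settings: the GPU dominates, so a faster CPU is invisible.
print(round(fps(cpu_ms=8.0, gpu_ms=25.0)))   # 40 fps
print(round(fps(cpu_ms=12.0, gpu_ms=25.0)))  # 40 fps

# 640x480: the GPU cost collapses and the CPU difference finally shows.
print(round(fps(cpu_ms=8.0, gpu_ms=2.0)))    # 125 fps
print(round(fps(cpu_ms=12.0, gpu_ms=2.0)))   # 83 fps
```

On this model both camps are right about different things: at 25ms of GPU work per frame the two CPUs tie, which is the "practical" high-res result, while the low-res numbers predict which CPU runs out of headroom first when faster GPUs arrive.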
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Even with the best cards today, games can still be GPU-bottlenecked. Running at 640x480 would show the CPUs' strengths far more than what you are doing.

I don't care either way - do what you want - but you cannot claim Bulldozer is a good gaming CPU; Bulldozer is terrible for gaming. Gamers should stay far, far away from Bulldozer for the time being.
You haven't shown that the FX-8150 is terrible at gaming. It is faster at stock than my LGA-1366 Core i7 at 3.8GHz, and an i7 at 3.8GHz is not terrible at gaming. Phenom II at 4.3GHz is certainly not "terrible" at gaming.

Running my benches at 1080p and up shows PRACTICAL results of playing with a certain CPU. Running at 640x480 is like a synthetic test - useless.

AND that *is* the point. Games are GPU-bottlenecked, and it is the RARE game that benefits from SB at any NORMAL gaming resolution and high details
:whiste:
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
i really don't think you get it at all. Why not bench at 640x480, if all you want to show is which CPU can play a game at 1000fps vs 1200fps?

What i am doing is showing the relative performance of 3 CPU platforms using the same video cards - and in two cases, the only difference is the CPU. However, i am benching at the resolutions that gamers play at - 1080p and above - and i use sufficiently powerful graphics that DOES show the relative performance differences of the CPUs.

You also need to realize that these reviews were all RUSHED. Part two will use much more powerful graphics - GTX 580 SLI/HD 6970-X3 - to shift the burden to the CPU, and even higher resolutions. Then we will have a pattern of how each CPU handles the load.

Not from me; i am way behind in benching. However, i am trying to get another reviewer who specializes in Linux to write for ABT.

I appreciate your charts, and look forward to the 580 SLI etc. I like how many games/benchmarks you are running, thanks for that.

I think what max is saying is that given two choices within a price band (AMD 8150 versus 2500K), if both are GPU limited at 1080p+ with high AA (which they are), how does one, with gaming as a primary need, decide which is the better value? I would say:

a) More powerful in gaming - tested by removing GPU bottleneck (lower resolutions) - somewhat useful as a predictor of future gaming performance when faster GPUs come out
b) Performance/watt and power use/heat considerations
c) Overclocking (if applicable to user)

Based on these, it's really hard to choose Bulldozer over the 2500K, especially since the 2500K is significantly cheaper on sale. For a gaming rig, I do not understand why anyone would consider buying Bulldozer unless they were upgrading from some really old/slow Phenom II but somehow already had an AM3/AM3+ mobo.

Building a new system as a gamer it is a no-brainer to go the Intel route.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think what xbit does is a good compromise. They take a game, put it on highest settings without AA, and test at a modest resolution with a fast single GPU. High settings are always needed, since many of those settings stress the CPU. 1680x1050 is a fairly common yet modest resolution, so it gives a very good idea of how CPUs stack up when not extremely GPU-limited.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I appreciate your charts, and look forward to the 580 SLI etc. I like how many games/benchmarks you are running, thanks for that.

I think what max is saying is that given two choices within a price band (AMD 8150 versus 2500K), if both are GPU limited at 1080p+ with high AA (which they are), how does one, with gaming as a primary need, decide which is the better value? I would say:

a) More powerful in gaming - tested by removing GPU bottleneck (lower resolutions) - somewhat useful as a predictor of future gaming performance when faster GPUs come out
b) Performance/watt and power use/heat considerations
c) Overclocking (if applicable to user)

Based on these, it's really hard to choose Bulldozer over the 2500K, especially since the 2500K is significantly cheaper on sale. For a gaming rig, I do not understand why anyone would consider buying Bulldozer unless they were upgrading from some really old/slow Phenom II but somehow already had an AM3/AM3+ mobo.

Building a new system as a gamer it is a no-brainer to go the Intel route.
Thank-you. It is my belief that if a reviewer uses a minimum of 20 new games, tests them at gaming resolutions and high detail, and uses a variety of video cards, a definite pattern emerges - one i have been able to show over the past three years - as to which CPUs are more powerful.

i agree on pricing. But you have to realize that the FX-8150's price is LIST price. As soon as market pricing takes hold, the relationship to SB will normalize, and i think FX will not be much more expensive than Phenom II.

AMD *had* to bring out BD; Phenom II is at a dead end. The mistake they made was their PR and expectations. The failure on the HW end was not being able to launch at 4.2GHz as a *base* clock.

This is very much like the HD 2900XT launch, where AMD positioned it against Nvidia's midrange. Look at the progress they made in five years on the graphics side.
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
You haven't shown that the FX-8150 is terrible at gaming. It is faster at stock than my LGA-1366 Core i7 at 3.8GHz, and an i7 at 3.8GHz is not terrible at gaming. Phenom II at 4.3GHz is certainly not "terrible" at gaming.

Running my benches at 1080p and up shows PRACTICAL results of playing with a certain CPU. Running at 640x480 is like a synthetic test - useless.

AND that *is* the point. Games are GPU-bottlenecked, and it is the RARE game that benefits from SB at any NORMAL gaming resolution and high details
:whiste:

I did show it in the reviews I linked. It's slightly worse than Phenom II. For a brand new architecture with 2 billion transistors and a big power draw, being merely on par with a predecessor that was never particularly strong at gaming counts as terrible.

Next year and the year after, people with Sandy Bridge will be okay; those with Bulldozer will have upgraded by then out of necessity, because it is weaker in gaming, and the low-res, low-settings tests show this. Running tests at standard 1080p proves nothing to gamers: throw in an Athlon II dual core with those GPUs at the same settings and you will get the same results, showing nothing about which CPU is more powerful when it comes to gaming.

This is the same situation we had years ago with Phenom I and C2Q. Today almost no one has a Phenom I, because it was a dog; many gamers still hang tough with their C2Q, though.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I did show it in the reviews I linked. It's slightly worse than Phenom II. For a brand new architecture with 2 billion transistors and a big power draw, being merely on par with a predecessor that was never particularly strong at gaming counts as terrible.

Next year and the year after, people with Sandy Bridge will be okay; those with Bulldozer will have upgraded by then out of necessity, because it is weaker in gaming, and the low-res, low-settings tests show this. Running tests at standard 1080p proves nothing to gamers: throw in an Athlon II dual core with those GPUs at the same settings and you will get the same results, showing nothing about which CPU is more powerful when it comes to gaming.

This is the same situation we had years ago with Phenom I and C2Q. Today almost no one has a Phenom I, because it was a dog; many gamers still hang tough with their C2Q, though.
The Phenom II X4 is still fine for gaming. Core2Quad also. People who got them are going strong. i am willing to bet that the FX-8150 will be fine for the next two years, as predicted by running multi-GPU now at really high resolutions.

i cannot see ONE situation where the Sandy Bridge i5-2500K is any better in any PRACTICAL way in gaming - except in ARTIFICIAL situations that the reviewers make up.

i gotta agree with you on the original Phenom; it sucked
:whiste:
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
The Phenom II X4 is still fine for gaming. Core2Quad also. People who got them are going strong. i am willing to bet that the FX-8150 will be fine for the next two years, as predicted by running multi-GPU now at really high resolutions.

i cannot see ONE situation where the Sandy Bridge i5-2500K is any better in any PRACTICAL way in gaming - except in ARTIFICIAL situations that the reviewers make up.

i gotta agree with you on the original Phenom; it sucked
:whiste:

Well, when GPUs get more powerful, a Bulldozer will bottleneck them long before a Sandy Bridge will - hence why it is terrible for gaming. It offers yesterday's Phenom II performance today, at a higher price and higher wattage... This cannot be seen when basing its performance entirely on GPU-limited benchmarks; that just disguises how much weaker Bulldozer is at gaming. Like I said before, throwing in an i3 or Athlon II would yield the same result as a 2600K, despite both being far less powerful CPUs.

The current iteration of Bulldozer will not have the longevity of the C2Q series.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
i cannot see ONE situation where the Sandy Bridge i5-2500K is any better in any PRACTICAL way in gaming - except in ARTIFICIAL situations that the reviewers make up.

Almost every review I've read has the 2500K beating the FX-8150 in most games, both at stock clocks and at 1920x1200 with high graphics settings and something like a GTX 580. The 2500K is cheaper and consumes less power.

Yeah, so far, the 2500K is much more practical than the BD CPUs...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Well, when GPUs get more powerful, a Bulldozer will bottleneck them long before a Sandy Bridge will - hence why it is terrible for gaming. It offers yesterday's Phenom II performance today, at a higher price and higher wattage... This cannot be seen when basing its performance entirely on GPU-limited benchmarks; that just disguises how much weaker Bulldozer is at gaming. Like I said before, throwing in an i3 or Athlon II would yield the same result as a 2600K, despite both being far less powerful CPUs.

The current iteration of Bulldozer will not have the longevity of the C2Q series.
That simply isn't true about the Athlon II or i3. They are definitely weaker for gaming at 1080p; i proved that a long time ago at the resolutions that gamers use.

And define "long before".
- it'll be three changes of video cards before anyone will need to upgrade an FX-8150
:whiste:

You can believe what you want to believe about BD being "weak" for gaming. That is just silly. i'd love to debate it further with you, but i have an article to finish writing and much further research to do with the FX-8150 and higher-performing graphics

Aloha
 

Maximilian

Lifer
Feb 8, 2004
12,603
9
81
That simply isn't true about the Athlon II or i3. They are definitely weaker for gaming at 1080p; i proved that a long time ago at the resolutions that gamers use.

And define "long before".
- it'll be three changes of video cards before anyone will need to upgrade an FX-8150
:whiste:

You can believe what you want to believe about BD being "weak" for gaming. That is just silly. i'd love to debate it further with you, but i have an article to finish writing and much further research to do with the FX-8150 and higher-performing graphics

Aloha

All that is being stressed at high settings is the GPU; the CPUs aren't breaking a sweat. Bulldozer is a lousy gaming CPU. I can't predict the future, but it will be obsolete before its Intel counterparts. No sensible gamer would choose a Bulldozer CPU at this point.

The benchmarks all say it - they don't even need to; the lower IPC than even Phenom II spells it out to gamers. I don't remember where I saw the table, but it showed one Sandy Bridge core with HT beating two Bulldozer cores in IPC. It's a fail for games.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
It makes more sense to bench CPUs the way you would normally play or bench the game any other time: full resolution, highest settings. Why test with low resolutions and settings? You're never going to play like that. Those numbers don't really mean anything for real-world performance, which is what matters...
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
You do realise that by using higher resolutions you are just doing a GPU review and not putting the strain on the CPU. You need to remember that you are doing a CPU review and need to bench at a resolution where the CPU can show its power.

Like running 3DMark01 at 1024x768, keeping both cards' clocks the same, to see which CPU gets the higher score - you know, like back in the day when the A64 came out and scored over 20k with a 9700 Pro card. That was the first time we got to break 20k with basically stock-clocked cards, and that was all due to the strength of the A64 chip.

The whole point of doing a CPU review is to show the CPU's strengths in a situation where its IPC makes a difference.

If one CPU scores higher at low res, it will scale higher with more cards and higher resolutions. How can you be a reviewer and not know this? You obviously do, and for some reason you are biased towards making BD seem a lot better than every other review site is claiming.

Heck, the way that you're benching BD, my old overclocked Q6600 would show the same frames up to a certain res, like 1400, and then it would really drop off. But that's OK: I would use a GTX 560, so my frames would still come close to matching a 2500K, since my card is not pushing the CPU enough to bottleneck it.

Run the res at 2560 on two screens if you really want to see if BD can keep up, and see how fast the frames drop at that res.

Edit: turn on max AA settings etc. as well. Heck, if you're going to run them at higher res you might as well push it to the max and see which card runs faster on each platform.

10 bucks says you will see a huge increase on an overclocked 2600K vs an overclocked BD
 
Last edited:

Caza

Junior Member
Oct 8, 2011
12
0
0
You can believe what you want to believe about BD being "weak" for gaming. That is just silly. i'd love to debate it further with you, but i have an article to finish writing and much further research to do with the FX-8150 and higher-performing graphics

Aloha

Your own benchmarks show an i7-920 spanking Zambezi in CrossFire mode. Are you reading the same charts?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Why on earth would AMD willingly allow an unfavorable motherboard to be the test bed? Even if Bulldozer ultimately isn't as bad and a bad motherboard is to blame, if AMD couldn't figure out it was a problem (particularly after all these months of delay), I'd have to think them even dumber for that than I do now for releasing such a dud of a processor.
 
Feb 19, 2009
10,457
10
76
So people buy expensive CPUs to run games at low res and low details? Since when is that common practice?

Frankly speaking, my Q9400 is freaken old and "obsolete", yet here i am enjoying BF3 at 1080p, smooth as butter.

I think most of you overestimate the importance of the CPU in gaming. It has always been about the GPU. So save yourself some $$: get a cheaper CPU that's capable and spend more on a better GPU (or GPUs), a bigger SSD, more RAM, etc.

Comparing FX to SB, it's a question of which apps you use the most. Synthetics are fantastic for discussion points, but for the average user it's all about the software they use. Gaming is irrelevant: both architectures are great for gaming, and it matters naught that SB is faster at low res. Meaningless.

If a CPU is not bottlenecking 3x top-tier GPUs in high-res, max-details gaming, it is going to be fine for the next few years; by then, most users would have upgraded or got a new rig already, so these points are invalid.

The only major flaw with FX is the huge power draw when OC'd; that's beyond ridiculous, and the ONLY reason i would not consider buying it.
 

zlejedi

Senior member
Mar 23, 2009
303
0
0
What you don't seem to clue into is that GAMERS play at 1080p and up, with powerful PCs and with settings as maxed as they can get away with.

FACT: i bench with GTX 580 SLI and HD 6970 TriFire-X3. And if that setup doesn't get bottlenecked, next year's won't either.

FACT: There is no game you can point to that is playable on SB at high resolutions and maxed details but not on BD (aside from driver issues).

Probably the most difficult issue for AMD is convincing people to upgrade from a fast quad Phenom II - those will be real bargains for gamers.

Fact:

A reviewer, by choosing the testing spot in a game, can prove whatever he wants, depending on whether he picks a spot limited by the CPU or by the GPU.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
If a CPU is not bottlenecking 3x top tier GPUs at high res/max details gaming, then it's fine for the next 3 years.

Except in this case it is. Apoppin's own charts show it. Take a look. The numbers boxed in red are where the i7-920 has a significant enough advantage that it would likely affect performance, including minimums. All the crossed-out stuff is either a GPU bottleneck (which tells you nothing about CPU gaming performance) or FPS high enough that any difference is moot, as performance would likely be unaffected. (Also, I disregarded the 8150's OC result because, well, with its performance per watt you'd be quite the tard to buy one for gaming.)

He's all about "real-world" benchmarking, and that's cool, because in his tests there's not a single game where having an 8150 would net a real-world performance increase over the 920, but quite a few that show the opposite. And let's not forget that anyone building a new rig today would not be choosing between a 920 and an 8150... Anand's bench shows both the 2500K and 2600K have a significant increase over the 920 for the few games tested (course this is at stock speeds...).

i cannot see ONE situation where the Sandy Bridge i5-2500K is any better in any PRACTICAL way in gaming
Because you test with a 920. Perhaps?
 
Last edited:

MisterMac

Senior member
Sep 16, 2011
777
0
0
Except in this case it is. Apoppin's own charts show it. Take a look. The numbers boxed in red are where the i7-920 has a significant enough advantage that it would affect performance. All the crossed-out stuff is either a GPU bottleneck (which tells you nothing about CPU gaming performance) or FPS high enough that any difference is moot, as performance would likely be unaffected. He's all about "real-world" benchmarking, and that's cool, because in his tests there's not a single game where having an 8150 would net a real-world performance increase over the 920, but quite a few that show the opposite. And let's not forget that anyone building a new rig today would not be choosing between a 920 and an 8150... Anand's bench shows both the 2500K and 2600K have a significant increase over the 920 for the few games tested (course this is at stock speeds...).

Because you test with a 920. Perhaps?

Don't bother with him. He's... well, beep beep.

We all know GPUs make up the majority of gaming performance these days, so any top-end CPU will run games fine. He's coming to a niche, geeky forum that cares about performance metrics and trying to convince people, "Hey GUYS! It doesn't matter, it's all GPU!" - like we're at Best Buy.

People with such a lack of social understanding should be banned from posting.

So apoppin, go down to Best Buy and start spewing your "ahem". And please do forget to mention that they will need to upgrade for 280 USD every 2 years, instead of 320 or 220 USD every 4 or 3 years. It'll make sense! It will!

It's a dead-end, limited CPU that WILL require replacing in the future; a 2500K/2600K won't. It may be fine for now, but no one except enthusiasts has such a rapid upgrade cycle - get that through your head.
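The upgrade-cost jab above is easy to check with quick arithmetic, amortizing each platform's price over how long it supposedly lasts (using the post's own hypothetical dollar figures, not real street prices):

```python
# Amortized platform cost per year, using the hypothetical figures from
# the post above: 280 USD every 2 years for a BD platform vs. 320 USD
# every 4 years or 220 USD every 3 years for a 2500K/2600K-class chip.

def cost_per_year(price_usd, years_kept):
    """Simple amortization: purchase price divided by years of service."""
    return price_usd / years_kept

print(cost_per_year(280, 2))            # 140.0 USD/year
print(cost_per_year(320, 4))            # 80.0 USD/year
print(round(cost_per_year(220, 3), 2))  # 73.33 USD/year
```

On those assumed numbers the "cheaper" chip costs nearly twice as much per year of service, which is the point the sarcasm is driving at.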
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The Phenom II X4 is still fine for gaming. Core2Quad also. People who got them are going strong.

I had a Q6600@3.2 and I replaced it because it was starting to bottleneck my system while gaming (single GTX 570). My 2500K is a big upgrade.

That said, the Q6600 was a wonderful chip - when I got it years ago it blew everything away. I could have bought some competing brand's cheaper chip, which would have worked OK at the time but wouldn't have lasted like my Q6600 did. Kind of like me buying a 2500K and not a BD.
 