Discussion Speculation: Zen 4 (EPYC 4 "Genoa", Ryzen 7000, etc.)


Vattila

Senior member
Oct 22, 2004
805
1,394
136
Apart from the details of the microarchitectural improvements, we now have a pretty good idea of what to expect from Zen 3.

The leaked presentation by AMD Senior Manager Martin Hilgeman shows that EPYC 3 "Milan" will, as promised and expected, reuse the current platform (SP3), and the system architecture and packaging look to be the same, with the same 9-die chiplet design and the same maximum core and thread count (no SMT-4, contrary to rumour). The biggest change revealed so far is the enlargement of the compute complex from 4 cores to 8 cores, all sharing a larger L3 cache ("32+ MB", likely to double to 64 MB, I think).

Hilgeman's slides also showed that EPYC 4 "Genoa" is in the definition phase (or was at the time of the presentation in September, at least) and will come with a new platform (SP5) and new memory support (likely DDR5).



What else do you think we will see with Zen 4? PCI-Express 5 support? Increased core-count? 4-way SMT? New packaging (interposer, 2.5D, 3D)? Integrated memory on package (HBM)?

Vote in the poll and share your thoughts!
 
Reactions: richardllewis_01

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,740
14,772
136
@Doug S What you describe is business as usual. In Germany, the cost of electric power has more than quadrupled within the last two years; that is not business as usual.
And as I said before, double the CPU wattage and it costs you 6x: 2x for CPU power, 2x for CPU cooling, 2x for CPU generator backup. And now with power at a premium, that 4x on the power price makes it more like 16x. Things WILL change in the data centers, or people ARE going to get fired.

Edit: and that's the best case, assuming Intel could keep up in performance at 2x the wattage. With Genoa, that will not be true.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,740
14,772
136
Source for those cost multipliers?
If the wattage is double (a fact for a given CPU), then the heat will be double, thus the AC cost doubles. Generator backup, same: twice the power needs twice the generators. I don't need more than common sense for this math. If it's not exact due to the heat, it could be a little off, but not much. This held true for the years I worked in a data center.

Edit: and if you are that sure they are wrong, why don't YOU come up with some stats to back up your opinion for once.
 

Exist50

Platinum Member
Aug 18, 2016
2,452
3,101
136
then the heat will be twice thus AC cost twice
And generator backup, same, as twice the power needs twice the generators
In both of these cases, you implicitly assumed that the cost of AC and backup exactly equals the actual electricity cost for a given power consumption, which is a silly assumption. A generator does not cost 5x as much just because electricity is 5x as expensive.

You're also assuming that those costs scale linearly with power consumption, when they certainly do not. Especially for intermittent things like backup power.

Hell, you even assume that a datacenter is 100% CPU power. Memory, storage, accelerators, etc. are all non-negligible.
Edit: and if you are that sure they are wrong, why don't YOU come up with some stats for proof of your opinion for once.
Unlike you, I don't make up numbers for that which I do not know. But I don't need to know them to point out the flaws in your math.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,740
14,772
136
In both of these cases, you implicitly assumed that the cost of AC and backup exactly equals the actual electricity cost for a given power consumption, which is a silly assumption. A generator does not cost 5x as much just because electricity is 5x as expensive.

You're also assuming that those costs scale linearly with power consumption, when they certainly do not. Especially for intermittent things like backup power.

Hell, you even assume that a datacenter is 100% CPU power. Memory, storage, accelerators, etc. are all non-negligible.

Unlike you, I don't make up numbers for that which I do not know. But I don't need to know them to point out the flaws in your math.
First, anybody with a brain knows those were approximations. And aside from spinning drives, which are not as common today, the rest is negligible. The CPU is the most power-hungry component in today's systems (again, aside from spinning drives, if used).
 

Exist50

Platinum Member
Aug 18, 2016
2,452
3,101
136
First, anybody with a brain knows those were approximations.
If so, those "approximations" are so wildly off base as to be useless. This isn't quibbling over a couple percent.
And aside from spinning drives, which are not as common today, the rest is negligible.
They most certainly are not. CPUs are the most power-hungry single component in non-accelerator nodes, but that doesn't mean the rest is negligible. Also, plenty of HDDs are still being used.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,740
14,772
136
If so, those "approximations" are so wildly off base as to be useless. This isn't quibbling over a couple percent.

They most certainly are not. CPUs are the most power-hungry single component in non-accelerator nodes, but that doesn't mean the rest is negligible. Also, plenty of HDDs are still being used.
OK, so you want to be anal. Here is a link: https://www.arcserve.com/blog/data-centers-what-are-costs-ownership

So the single biggest cost in a data center is electricity, at roughly 20% of the total. Even if only half of that went to CPUs, doubling CPU power would add 10% to your data-center cost. Power, server and HVAC equipment is number 2 at 24% (total); call it 16% for power and HVAC, and that would double too. So you end up at roughly 26% more cost by running servers whose CPUs use twice the power. Even if these estimates are off, any financial person would have a fit over that kind of increase.

THIS IS THE POINT I AM TRYING TO MAKE. Genoa will save companies a lot of money on a daily basis.
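To make that arithmetic easy to check, here's a tiny back-of-envelope sketch in Python. The shares are rough assumptions pulled from the linked article's breakdown (not measured figures), and the model is deliberately the same simple linear one used above:

```python
# Back-of-envelope model of the extra cost from doubling CPU wattage.
# All shares are rough assumptions based on the linked article's breakdown.
electricity_share = 0.20   # electricity as a share of total data-center cost
cpu_fraction = 0.50        # assume half of the electricity goes to CPUs
power_hvac_share = 0.16    # power delivery + HVAC equipment share

# Doubling CPU wattage doubles the CPU slice of the electricity bill and,
# in this simple linear model, the power/HVAC equipment cost as well.
extra = electricity_share * cpu_fraction + power_hvac_share
print(f"added cost vs. baseline: {extra:.0%}")  # → 26%
```

Change the assumed shares and the conclusion moves, but it stays a double-digit percentage for any plausible inputs, which is the point being argued.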
 
Reactions: Tlh97 and moinmoin

Yosar

Member
Mar 28, 2019
28
136
76
A 54-game comparison between the 7600X and 13600K.

The 7600X holds a slight advantage, even with faster memory on the Intel side, and does it using less power.


That's why gaming tests in launch reviews should be totally ignored unless they cover a really large number of games (and usually they don't, due to time constraints).
It's no surprise; it's actually the same thing AMD said on their slides (5% faster than the 12900K, and many reviews put the 13600K on the same level as the 12900K, sometimes higher).
After all, they had enough time to thoroughly benchmark both processors, unlike most 'independent reviews'.

And some reviews were calling the 13600K the best gaming CPU... Well, the king was dethroned quite quickly, I suppose. Or maybe the emperor turned out to be naked.
There is nothing special in those results confirming that the 13600K is some monstrous gaming CPU (well, unless a much bigger power draw counts).
AMD didn't even need the X3D Zen 4 processors.
I've got a feeling it won't be a pretty picture for Raptor Lake when Zen 4 X3D launches, if even Raptor Lake couldn't beat Zen 4 in gaming.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
That's why gaming tests in launch reviews should be totally ignored unless they cover a really large number of games (and usually they don't, due to time constraints).
It's no surprise; it's actually the same thing AMD said on their slides (5% faster than the 12900K, and many reviews put the 13600K on the same level as the 12900K, sometimes higher).
After all, they had enough time to thoroughly benchmark both processors, unlike most 'independent reviews'.

And some reviews were calling the 13600K the best gaming CPU... Well, the king was dethroned quite quickly, I suppose. Or maybe the emperor turned out to be naked.
There is nothing special in those results confirming that the 13600K is some monstrous gaming CPU (well, unless a much bigger power draw counts).
AMD didn't even need the X3D Zen 4 processors.
I've got a feeling it won't be a pretty picture for Raptor Lake when Zen 4 X3D launches, if even Raptor Lake couldn't beat Zen 4 in gaming.

You are spot on!
Zen 4 is already better at gaming than RPL.
The 13900K puts up a tougher fight, but the rest of the product stack loses to its Zen 4 counterparts.

I want to add that while several reviews put the 13600K ahead, a few of them put Zen 4 on top.
The biggest difference here, in addition to the number of games tested, is that many of the "reviewers" don't have a clue how to set up a Zen 4 system.
In absolute numbers there is little variance in the RPL results among all the reviewers: a more mature platform with more consistent results.
However, the numbers for Zen 4 are all over the place. While a few outlets (TechSpot in particular) were able to configure their Zen 4 machines properly and show them leading the pack, most places did not know how to set up Ryzen 7000 systems, and their Zen 4 numbers are much lower than those from the outlets that did configure the systems properly.
 

DrMrLordX

Lifer
Apr 27, 2000
21,797
11,143
136
The biggest difference here, in addition to the number of games tested, is that many of the "reviewers" don't have a clue how to set up a Zen 4 system.

That's part of it. Also, some of the launch-day 13900K reviews used 4090s while the 7950X launch-day reviews used 3090 Tis, and somehow the 7950X et al. showed lower numbers with the 4090...? Something was very wrong.

You're still going to get some people strutting around here claiming Raptor Lake is 11-30% faster in games, faster in applications (?!), etc. It's a bit tiresome.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You are spot on!
Zen 4 is already better at gaming than RPL.
The 13900k puts a tougher fight, but the rest down the product stack lose to their Zen 4 counterparts

Hold your horses. That BFV benchmark he did obviously has something wrong with it, as it doesn't make sense. There is no way on God's green Earth that the 7600X is 36% faster than the 13600K in BFV and then 6% slower in BF2042.

Steve should have thrown out that result, as it's not plausible. Those kinds of extreme outliers are obviously incorrect. I would say the same for Horizon Zero Dawn and some of the others. It might be a problem with the efficiency cores getting in the way, but there is no way the 7600X leads by such a huge margin in those games.

And just looking at the game selection, the 13600K is more dominant in CPU-demanding games. Call of Duty MW2 multiplayer performs better on the 13600K than the single-player campaign does, as the MP is more CPU-demanding. Raptor Lake has a stronger core than Zen 4 in terms of raw CPU power.

I usually like HWU, but no one else is coming out with these weird results and publishing them.
 
Reactions: Tlh97 and yuri69

In2Photos

Golden Member
Mar 21, 2007
1,665
1,682
136
Hold your horses. That BFV benchmark he did obviously has something wrong with it as it doesn't make sense. There is no way on God's green Earth that the 7600x is 36% faster than the 13600K in BF5 and then 6% slower in BF 2042.

Steve should have thrown out that result as it's not possible. Those kinds of extreme outliers are obviously incorrect. I would say the same for Horizon Zero Dawn and some of the others. It might be a problem with the efficiency cores getting in the way, but there is no way the 7600x leads by such a huge margin in those games.
He said that if you take out BF5, the 7600X is still 4% faster overall. It's at 9:55 in the video.
And just looking at the game selection, the 13600K is more dominant in CPU demanding games. Call of Duty MW2 MP performs better on the 13600K compared to the single player game as the MP is more CPU demanding. Raptor Lake has a stronger core than Zen 4 in terms of raw CPU power.

I usually like HWU, but no one else is coming out with these weird results and publishing them.
So you prefer the reviews where they cherry-pick the titles to show Intel as superior? Got it.
 

eek2121

Diamond Member
Aug 2, 2005
3,045
4,266
136
And I wouldn't be surprised if some people end up getting fired 1-2nd quarter next year when businesses get hit with a 20-30% electricity increase in the US. If they haven't seen their electrical cost go up, they're either not looking or it hasn't filtered to them yet. Not as bad as EU, however its coming.

From what I gather, infrastructure in the US is quite different from the EU's. For starters, nearly all the electricity in my area comes from hydroelectric, nuclear, and solar; only a small amount comes from fossil fuels. Depending on where you are in the US, generation may range from renewables to nuclear, natural gas, and even coal. Also, the U.S. does not rely on other countries for natural gas or coal.

FWIW, my electricity cost hasn't risen by a dime in at least three years (the oldest bill I could find). I pay $0.11/kWh after all the fees and surcharges.
 

Kaluan

Senior member
Jan 4, 2022
503
1,074
106
Hold your horses. That BFV benchmark he did obviously has something wrong with it as it doesn't make sense. There is no way on God's green Earth that the 7600x is 36% faster than the 13600K in BF5 and then 6% slower in BF 2042.

Steve should have thrown out that result as it's not possible. Those kinds of extreme outliers are obviously incorrect. I would say the same for Horizon Zero Dawn and some of the others. It might be a problem with the efficiency cores getting in the way, but there is no way the 7600x leads by such a huge margin in those games.

And just looking at the game selection, the 13600K is more dominant in CPU demanding games. Call of Duty MW2 MP performs better on the 13600K compared to the single player game as the MP is more CPU demanding. Raptor Lake has a stronger core than Zen 4 in terms of raw CPU power.

I usually like HWU, but no one else is coming out with these weird results and publishing them.
Holy cherry-picking 😂
As if a 1-in-54 outlier would change the outcome that much (HUB already covered it anyway).

Edit: BFV and BF2042 may use the same family of engines, but they're not the same game, are they? Codemasters' F1 and DIRT games also share a family of engines, and that doesn't stop us from seeing wild variance in results between AMD, Intel and NVIDIA GPUs and CPUs. This is a non-argument.

What's next? Factorio and MMOs banned from V-Cache testing? E-sports titles banned from AMD testing?

TPU just did a big E-cores-on-vs-off round-up, and Raptor Lake performs virtually the same in games either way.

And if those "arguments" fail, I guess you'll just rant about how, if you used a $400+ DDR5-7600 kit, the 13600K would be faster or something 😊

While ignoring that the kit they ran the 13600K 54-game tests on would make the 7600X even faster in the first place 😁

 

Thunder 57

Platinum Member
Aug 19, 2007
2,808
4,092
136
He said if you take out BF5 the 7600X s still 4% faster overall. It's at 9:55 in the video.

So you prefer the reviews where they cherry pick the titles to show Intel as superior? Got it

Holy cherry picking 😂
As if a 1/54th outlier would change the outcome that much (HU already covered it anyway).

What's next? Factorio and MMOs are banned in V-Cache testing? E-Sports titles banned on AMD testing?

TPU just did a E-core on v off big round-up, and Raptor Lake performs virtually the same in games in both situations.

And if those "arguments" fail, guess you'll just rant on how if you use $400+ 7600MT kits, 13600K would be faster or something😊

While ignoring that the kit they did the 13600K 54 game tests on would make the 7600X even faster in the first place 😁


Just follow the post history; it's quite easy to see who has an agenda. Yes, that goes for AMD as well.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
He said if you take out BF5 the 7600X s still 4% faster overall. It's at 9:55 in the video.

I know, and he also said he would look into it further because it was anomalous. But it's not just BFV; HZD is also anomalous.

Whenever games show such a large swing, you have to question why. It makes no sense. In his original comparison in October, during the Raptor Lake launch, the 5950X is ahead of the 13900K with DDR5-6400!

Now if that doesn't raise flags, what will? The game is likely leaning on the E-cores; no other explanation makes sense. It's too bad I don't own the game, as I would run some tests on it.

The 13600K gets steamrolled in Horizon Zero Dawn by a 28% margin in favor of the 7600X. It's bizarre how well the Zen 4 CPUs perform in this game; it's certainly an outlier in our limited 12-game testing, but a strong result for AMD all the same.





So you prefer the reviews where they cherry pick the titles to show Intel as superior? Got it

Don't be ridiculous. There's a thing called confirmation bias and another thing called critical thinking; you are displaying the former and I the latter. Think about it: a 5950X beating a 13900K...
 
Jul 27, 2020
17,849
11,642
116
Think about it, a 5950x is beating a 13900K......
Honestly, I'm not surprised. It's a console port from a console with an AMD CPU. What more explanation is needed?

The non-conspiracy-theory explanation: the devs wanted to maximize their engine's performance on all available cores, so they built it from the ground up to scale with core count. However, the only CPU they tested that engine on was AMD's, and they made some assumptions in code based on the available hardware. Now the game runs best on the Zen architecture. For the PC port, they didn't bother, or didn't want to go to the trouble of, optimizing again for Intel CPUs. It worked fine in their estimation, so that's how they shipped the game.

Personally, I think it's only fair to let AMD users have an unfair advantage for once. Almost every PC game out there is written for Intel first and AMD CPUs have to run those games with the strength of their architecture alone without the benefit of specific optimizations.

I would love to hear from game developers on these forums if they have ever bothered to read through AMD's CPU optimization manuals.
 
Reactions: Tlh97 and Yosar

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Holy cherry picking 😂
As if a 1/54th outlier would change the outcome that much (HU already covered it anyway).

As I said above, HZD is also another one.

How are you going to spin the 5950X beating a 13900K with DDR5? I realize this is an AMD thread and even the entire forum leans AMD, but even you have to admit that makes no sense.

Both CPUs are x86-64, so you can't pin it on optimization issues either.

Edit: BFV and BF2042 may use the same family of engines, but they're not the same are they? Codemaster's F1 and DIRT games also use the same family of engines between them, that doesn't mean we don't see wild variance in results between AMD, Intel and nVidia GPUs and CPUs. This is a non-argument.

BF2042 is much more CPU-demanding and runs a later version of the Frostbite engine; BFV can reach much higher FPS than BF2042 because it is less demanding.

What's next? Factorio and MMOs are banned in V-Cache testing? E-Sports titles banned on AMD testing?

Factorio has an explanation: the simulation runs primarily out of cache, which is why the 5800X3D is so dominant there.

TPU just did a E-core on v off big round-up, and Raptor Lake performs virtually the same in games in both situations.

The E-cores are a possibility, but such a big discrepancy looks to me more like a configuration problem or a bug in the game itself.

And if those "arguments" fail, guess you'll just rant on how if you use $400+ 7600MT kits, 13600K would be faster or something😊

I realize you're being flippant, but this is just an observation on my part, because it obviously doesn't add up when you think about it. In HZD, a 5950X with lower IPC, DDR4 and lower clock speeds is beating the fastest gaming CPU currently available.

While ignoring that the kit they did the 13600K 54 game tests on would make the 7600X even faster in the first place 😁

I don't own a Zen 4 CPU, but even I know that Zen 4 runs best in 1:1 gear mode (memory controller clock matching the memory clock), which makes a DDR5-7600 kit useless for Zen 4.
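For anyone unfamiliar with the gear-mode point: on DDR5 the memory clock (MCLK) is half the transfer rate, and Zen 4's memory controller clock (UCLK) reportedly can't hold 1:1 with MCLK much past DDR5-6000/6400, so faster kits fall back to a higher-latency 1:2 ratio. A rough sketch of that arithmetic (the 3200 MHz UCLK ceiling below is an assumed round figure for illustration, not an official spec):

```python
# Rough sketch of Zen 4 memory gearing. The UCLK ceiling is an assumed
# round number for illustration, not an official specification.
def zen4_gearing(ddr_rate, uclk_limit_mhz=3200):
    """Return (uclk_mhz, mode) for a DDR5 kit at the given MT/s rate."""
    mclk = ddr_rate / 2              # DDR: memory clock is half the MT/s rate
    if mclk <= uclk_limit_mhz:
        return mclk, "1:1"           # controller keeps pace: lowest latency
    return mclk / 2, "1:2"           # controller halves: latency penalty

for kit in (6000, 7600):
    uclk, mode = zen4_gearing(kit)
    print(f"DDR5-{kit}: UCLK {uclk:.0f} MHz in {mode} mode")
```

Under these assumptions a DDR5-7600 kit leaves the controller at 1900 MHz in 1:2 mode, below where a DDR5-6000 kit runs in 1:1, which is why the extra bandwidth doesn't help Zen 4 in games.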
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Honestly, I'm not surprised. It's a console port from a console with an AMD CPU. What more explanation is needed?

Game optimization doesn't work that way in the PC space. Developers don't target specific CPUs for optimization; they target ISAs.

AMD and Intel share the same basic x86-64 ISA, so there shouldn't be any optimization issues. The size of the performance deficit indicates to me a bug in the game itself or a configuration/settings issue.

The non-conspiracy theory explanation: the devs wanted to maximize their game engine performance on all available cores so they built the engine from the ground up to scale with available cores. However, the only CPU they tested that engine was AMD and they made some assumptions in code based on available hardware. Now the game runs best on Zen architecture. For the PC port, they didn't bother or didn't want to go to the trouble of optimizing again for Intel CPUs. It worked fine in their estimation so that's how they shipped the game.

As I said earlier, PC game developers don't target specific CPUs or microarchitectures; they target the ISA itself, which is x86-64. They do the same for GPUs, except there the target is the graphics API: they don't optimize for Ampere or Navi, they optimize for DX12, DX11 or Vulkan.

Only on consoles do they target the hardware at the lowest levels, and only in exclusive titles like God of War, Gears 5, etcetera, which aren't cross-platform.
 

Timorous

Golden Member
Oct 27, 2008
1,727
3,152
136
I know, and he also said he would look into it more because it was anomalous. But it's not just BF5. HZD is also anomalous.

Whenever games show such a large swing you have to question why. It makes no sense. In his original comparison in October during the Raptor Lake launch, the 5950x is ahead of the 13900K with DDR5-6400!

Now if that doesn't raise flags, what will? The game is likely leaning on the E cores. No other explanation makes sense. It's too bad I don't own the game as I would run some tests on it.









Don't be ridiculous. There's a thing called confirmation bias and another thing called critical thinking. You are displaying the former and I the latter. Think about it, a 5950x is beating a 13900K......

The thing is, HUB used a 4090. Other outlets that used Ampere cards got very different results, but Igor, who went with a 6950 XT, saw the 7700X 6% ahead of the 13600K on average.

Personally, I think any tests using Ampere GPUs with the pre-522 driver are out of date, given the new driver and the 4000-series cards with the fixed drivers.
 
Jul 27, 2020
17,849
11,642
116
Only on consoles do they specifically target the hardware at the lowest levels.
That's my point. When they were developing PlayStation-exclusive games, they couldn't have known that one day Sony would turn around and ask them to release PC versions. So they basically "tied" the game code to the AMD architecture (specific optimizations that give the best performance for multicore workloads).

But yes, in the case of ADL and RPL, it is quite possible that Windows 11 is scheduling critical game threads on the E-cores, which may be causing the slowdown. It is also possible that the P-cores are getting dragged down by the latency of waiting for results from threads executing on E-cores. On AMD, all cores are equal in processing power, so all the threads get along fine; on ADL and RPL, the P-core threads suffer from slow execution of game code on the E-core threads. That's my most plausible theory.
 
Reactions: Tlh97 and Schmide

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That's my point. When they were developing playstation exclusive games, they couldn't have known that one day Sony would turn around and ask them to release PC versions. So they basically "tied" the game code to AMD architecture (specific optimizations that give the best performance for multicore workloads).

They would have to be completely incompetent to do such a thing and not change it for the PC port. The HZD port wasn't done by Guerrilla Games, the game's developer, which only develops for PlayStation; it was offloaded to another studio that specializes in ports and presumably wouldn't make such an idiotic decision.

In fact, if they had done such a thing, I doubt the game would even run on non-AMD hardware. That's obviously not the case: it runs on Intel as well, and at high framerates. And what about BFV? That game was handled by EA DICE, which has been doing primary development on the PC platform for years.

But yes, in the case of ADL and RPL, it is quite possible that Windows 11 is scheduling critical game threads on the E-cores which may be causing the slowdown. It is also possible that the P-cores are getting dragged down by the latency of waiting for results from threads executing on E-cores. On AMD, all cores are equal in processing power so all the threads would get along fine. On ADL and RPL, the P-core threads are suffering due to slow execution of game code on E-core threads. This is my plausible theory.

Who knows what it is; all I know is that it needs further investigation. Things like this can affect AMD as well, and when the shoe is on the other foot, I bet people won't have the same response they've had in this thread to me making a stink about HWU's testing.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The thing is Hub have used a 4090. Other outlets that used Ampere cards got very different results but Igor who went with a 6950XT saw the 7700X 6% ahead of the 13600K on average.

Personally I think any tests using Ampere GPUs with the pre 522 driver are all out of date given the new driver and the 4000 series cards with the fixed drivers.

I'm not saying the 7600X can't be faster than the 13600K in gaming; don't get me wrong. What I'm saying is that HWU's benchmarks for HZD and BFV are clearly anomalous and need further looking into.

It's just those two benchmarks that look to be off.
 