[BitsAndChips]390X ready for launch - AMD ironing out drivers - Computex launch

Feb 19, 2009
10,457
10
76
I always go by TPU's maximum power consumption figures, not whatever the manufacturer claims the TDP is. Here's what they came up with for various GTX 980 cards:

  • Reference: 190W
  • MSI Gaming: 207W
  • Asus Strix: 199W
  • Gigabyte G1 Gaming: 342W (!!!)
  • Asus Matrix: 193W
As you can see, all of these cluster roughly in the same range except the Gigabyte, which is a gross outlier: apparently this AIB disabled the chip's power-limit functionality altogether, resulting in power consumption regressing to GTX 480 levels. I would describe the GTX 980's real TDP as 190W for the stock card, with factory-OC'd versions naturally having higher figures. I had almost forgotten that Nvidia was claiming 165W; that's definitely too low.
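
As a quick illustration of how lopsided that list is, here is a minimal Python sketch. The wattages are the TPU figures quoted above; the 25% cutoff and the variable names are just illustrative choices of mine:

```python
from statistics import median

# TPU maximum power consumption figures for GTX 980 boards, as listed above (watts)
gtx980_max_power_w = {
    "Reference": 190,
    "MSI Gaming": 207,
    "Asus Strix": 199,
    "Gigabyte G1 Gaming": 342,
    "Asus Matrix": 193,
}

med = median(gtx980_max_power_w.values())  # 199 W for this set
threshold = 1.25  # flag anything more than 25% above the median (arbitrary cutoff)

for card, watts in sorted(gtx980_max_power_w.items(), key=lambda kv: kv[1]):
    tag = "  <-- outlier" if watts > med * threshold else ""
    print(f"{card:>20}: {watts} W ({watts / med:.2f}x median){tag}")
```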

People who have been paying attention to such details have known for a long time that NV's TDP figure is BS, well below what the card can actually draw. There was an article/interview back during the early FurMark throttling days ("because it's a power virus, not representative of average gaming loads") in which NV said its TDP reflects the average gaming load. It's still under spec.

Also, if a lowly 370/X has HBM, one would have to accept that it's not a simple Tonga/Hawaii rebrand. Those chips would need to be redesigned with a new memory subsystem to handle HBM. They may have the same SP count, but everything else would be different; in fact they would no longer be Tonga or Hawaii at all.
 
Feb 19, 2009
10,457
10
76
I honestly don't think many people buy based on performance/watt. Whenever it's mentioned in forums, it's typically in conjunction with other metrics like noise and heat as well as performance. The GTX 980, for example, doesn't cost what it costs simply because its performance/watt is greater than that of a 290X. In fact, it beats it in every metric I can think of: it's more efficient, it's faster, it's cooler, it's quieter. That's before we even get into the developer/vendor penetration, which Nvidia has been much more successful at than AMD, and all this is before we even get into marketing, which again, Nvidia has been more successful at than AMD.

Point being, there are a lot of reasons Nvidia has been more successful at moving product than AMD. Performance/watt is but one, and IMO a rather small one compared to some of the other reasons.

Quiet & cool depend highly on the model. For a reference-vs-reference comparison, sure.

The 290X Vapor-X is quieter and cooler than most 970s/980s. This brings us to the point that the terrible reference cooling on the R9 290/X hurt it massively. The stigma stuck.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
AMD never assigned any official TDP rating to the Hawaii cards. They told Anandtech that the "average gaming power scenario" is 250W, but the reviewer thought that the real TDP was "closer to 300W". TechPowerUp's tests indicate that AMD is right about average power consumption, but the actual TDP for the reference card in Uber mode is 315W.

But the problem with your using FurMark, as we've discussed on many occasions, is that no game and no mining / distributed computing / rendering program ever loads the GPU that much. You keep saying that FurMark is a real-world program because you can run it on your GPU, but that's no different from running Intel's IBT on a 5960X with HT. Since no real-world program/task can load every part of the GPU and memory the way FurMark's max load can, the FurMark test is synthetic by definition. I think you and I discussed how you can't load 100% of your shaders, ROPs, TMUs, L2 cache and memory controller at the same time, but that's exactly what a power virus like FurMark is designed to do. Therefore, I really do not understand why you keep insisting on its relevance.

If the R9 290X used 315W of power in games and the 980 used 200W as you state, those numbers don't add up with their respective total system power usage when gaming in an i7 rig:

http://www.techspot.com/review/977-nvidia-geforce-gtx-titan-x/page9.html

Either way, you and I both agree on the conclusion that TDP does not always mean power usage -- TDP can understate it, match it or overstate it. That's been my point when people take the 140W TDP for the 970 and 165W for the 980 and start comparing them against rumored R9 300 series TDPs. They can't be compared directly.
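
To make that last point concrete, here is a small, hypothetical Python sketch: the classification helper and the 5% tolerance are mine, while the wattages are the ones quoted in this discussion (Nvidia's 165W claim vs. TPU's 190W maximum for the 980, and AMD's 250W "average gaming" figure vs. the 315W Uber-mode maximum for the 290X). It just shows why published TDPs can't be compared head-to-head:

```python
def compare_tdp(claimed_w: float, measured_max_w: float, tolerance: float = 0.05) -> str:
    """Say whether a published figure understates, overstates, or roughly matches a measured maximum."""
    ratio = measured_max_w / claimed_w
    if ratio > 1 + tolerance:
        return "understates"
    if ratio < 1 - tolerance:
        return "overstates"
    return "roughly matches"

# (published/claimed figure, measured maximum) in watts, per the reviews cited in this thread.
# Note: the 290X's 250 W is AMD's "average gaming" figure, not an official TDP.
cards = {
    "GTX 980 (reference)": (165, 190),
    "R9 290X (Uber mode)": (250, 315),
}

for name, (claimed, measured) in cards.items():
    print(f"{name}: claimed {claimed} W {compare_tdp(claimed, measured)} the measured {measured} W")
```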
 
Last edited:
Aug 11, 2008
10,451
642
126
What you are saying would only be true if flagship cards went from using 250-275W of power to just 150-200W. That's not what's happening. What you described is exactly what NV wants us to believe. They created a new marketing strategy whereby a video card's worth is tied directly to its perf/watt, rather than its marginal utility (price/perf). This is a brilliant marketing strategy because once the consumer believes this marketing BS, NV can double prices but still sell you a GTX560Ti (980) and a GTX580 (Titan X) under the veil of perf/watt. Think about it: in the past, did newer GPUs improve perf/watt over older ones? Absolutely, but AMD/NV hardly used it as a major selling point to get you to upgrade. Today, perf/watt is marketed as THE most important factor for upgrading. It's not as if the Titan X or a 980 uses significantly less power than a GTX580 or 560Ti. They don't. Then why all of a sudden are they attractive enough to warrant double the price of their historical lineage predecessors?

If you ask a new PC gamer, they wouldn't know that the GTX560Ti was from the 680-980 lineage and cost $249, while the GTX280/480/580 were from the Titan lineage and cost $499. The perf/watt marketing worked, but NV still continues to sell 250W flagships; it's just that today they ask double the price. I think AMD will still give us a flagship 250W card that is 40-50% faster than the R9 290X. AMD has remained the price/performance king since the HD4850/4870 series, and I don't see the R9 300 changing that trend. Since the R9 290X sells for $280-300 today, even if AMD releases a flagship at $550-650, it should be enough to cover the 500mm2+ die size and HBM and make more money than they are currently making on those 290 cards.



While I hate to go off-topic on financials, in this case it's on topic. Increased manufacturing costs of larger-die GPUs on newer nodes in no way offset the major price increases certain GPUs experienced in the last 5 years. NV's FY2010 gross margins were 35.4%, and they increased every single year to 55.5% as of FY2015. Therefore, the theory that GPU makers can't afford to manufacture GPUs with more transistors at similar die sizes as in the past doesn't fly. I truly believe perf/watt is used as a marketing tactic to justify the price increases today because performance increases have slowed down (it took 3 years for the 780Ti to double the 580). That means GPU makers cannot sell us on performance as easily anymore (just look at the 960 vs. 760: a total disaster).

How do you market something a consumer doesn't really need? You need to devise a strategy that makes his/her existing product seem vastly inferior in some way, so that he/she thinks it's outdated tech. Today, the easiest way to do this is perf/watt marketing. Even Intel is doing it. Intel will probably try to launch 35W Skylake CPUs that are nearly as fast as the i5 4690K/i7 4770K. The focus on perf/watt suddenly makes your perfectly fine Haswell CPU look outdated. I think the focus is on perf/watt today because it's the easiest metric to market and the easiest one to hype up, since it shows the greatest improvement from one generation to another among all the metrics consumers actually understand. All of a sudden you can market a 35W CPU that's only slightly faster than a 65W one as being twice as good!
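
A back-of-the-envelope version of that 35W-vs-65W example, with hypothetical scores purely to show the arithmetic: a chip that is only ~5% faster can still be marketed as nearly 2x better once the comparison switches to perf/watt.

```python
# Hypothetical numbers: an existing 65 W CPU vs. a "slightly faster" 35 W successor
old_perf, old_watts = 100.0, 65.0
new_perf, new_watts = 105.0, 35.0

perf_gain = new_perf / old_perf                                       # raw speedup
perf_per_watt_gain = (new_perf / new_watts) / (old_perf / old_watts)  # efficiency gain

print(f"Raw performance:      {perf_gain:.2f}x")           # 1.05x
print(f"Performance per watt: {perf_per_watt_gain:.2f}x")  # ~1.95x, i.e. "twice as good"
```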

The consumer is entitled to use whatever metric he wants to determine his purchase decision. The recurring theme that AMD makes great products but is bleeding market share in both CPUs and GPUs because consumers are uninformed, deluded by marketing, don't know the facts, etc., etc. gets really old after a while. The CPU market has proven that marketing old, outdated, inefficient products is the path to irrelevancy. They face the same fate in GPUs unless the R300 series makes huge improvements. Just because you justify every AMD product with walls of text "proving" their performance-per-dollar advantage does not mean the consumer has to accept that, or will not use other metrics as well to make their purchase decision.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
The consumer is entitled to use whatever metric he wants to determine his purchase decision. The recurring theme that AMD makes great products but is bleeding market share in both CPUs and GPUs because consumers are uninformed, deluded by marketing, don't know the facts, etc., etc. gets really old after a while. The CPU market has proven that marketing old, outdated, inefficient products is the path to irrelevancy. They face the same fate in GPUs unless the R300 series makes huge improvements. Just because you justify every AMD product with walls of text "proving" their performance-per-dollar advantage does not mean the consumer has to accept that, or will not use other metrics as well to make their purchase decision.

Same could be said for everything you just said.

It's marketing that's done it, not perf per watt.
 
Feb 19, 2009
10,457
10
76
The consumer is entitled to use whatever metric he wants to determine his purchase decision. The recurring theme that AMD makes great products but is bleeding market share in both CPUs and GPUs because consumers are uninformed, deluded by marketing, don't know the facts, etc., etc. gets really old after a while. The CPU market has proven that marketing old, outdated, inefficient products is the path to irrelevancy. They face the same fate in GPUs unless the R300 series makes huge improvements. Just because you justify every AMD product with walls of text "proving" their performance-per-dollar advantage does not mean the consumer has to accept that, or will not use other metrics as well to make their purchase decision.

This is true: for some, perf/W may not matter, but I don't think that applies to the majority. As a result, we're seeing Maxwell destroy the entire AMD stack.

Prior to the 970/980, AMD was doing alright with the R9 290/X vs. the 780/Ti, as well as with the low/mid-range stuff. Market share crept back above 37% for AMD, on a trajectory to 40%. Then the 970/980 landed and it was all downhill fast.

To think that efficiency isn't a winner with that record is wrong.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I honestly don't think many people buy based on performance/watt. Whenever it's mentioned in forums, it's typically in conjunction with other metrics like noise and heat as well as performance. The GTX 980, for example, doesn't cost what it costs simply because its performance/watt is greater than that of a 290X. In fact, it beats it in every metric I can think of: it's more efficient, it's faster, it's cooler, it's quieter. That's before we even get into the developer/vendor penetration, which Nvidia has been much more successful at than AMD, and all this is before we even get into marketing, which again, Nvidia has been more successful at than AMD.

Point being, there are a lot of reasons Nvidia has been more successful at moving product than AMD. Performance/watt is but one, and IMO a rather small one compared to some of the other reasons.

Attaching a tarnished brand name like AMD to ATI didn't help. Outside of a brief moment of glory vs. Intel's Netburst, AMD has always been viewed with skepticism if not outright derision. ATI has usually been second banana to NV over the years, but the gap wasn't as big as it was between Intel and AMD in CPUs, so the ATI brand wasn't as tarnished. I still think AMD took down the ATI branding way too many years in advance.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
I think perf/watt is a really important metric in thin notebooks, tablets, phones, and various small-form-factor PCs. Heck, even supercomputers have the Green500 to show off their efficiency.

When it comes to the desktop, I am not exactly interested in the most efficient clock speed, but rather in eking out the most performance possible without pushing the voltage too far.

I love the performance/watt of my laptop and phone, but my desktop was not built for ultra efficiency.

If I had a tiny HTPC for gaming I would absolutely be focused on efficiency.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
I do think it's sort of silly that so much focus has been placed on efficiency for the mid-to-high-end cards. It makes sense for the budget/low-end cards because it's great to drop them into a cheap store-bought desktop (see 750 Ti), but the vast majority of users are not pulling out the kilowatt meters to see how much power they're using while they play GTA V.

I would like to say that this is great marketing on NVIDIA's part, but I'm not so sure it is. It's more their users that have done the marketing for them on this front. I haven't seen such an emphasis in PC gaming forums on power efficiency and perf/W in... well... ever. If the rumors about the lower power usage for the 300 series are true, though, then good for AMD, because it seems like they need to be better, or at least compete, on every front to even be considered a purchase these days.
 

crashtech

Lifer
Jan 4, 2013
10,556
2,139
146
Attaching a tarnished brand name like AMD to ATI didn't help. Outside of a brief moment of glory vs. Intel's Netburst, AMD has always been viewed with skepticism if not outright derision. ATI has usually been second banana to NV over the years, but the gap wasn't as big as it was between Intel and AMD in CPUs, so the ATI brand wasn't as tarnished. I still think AMD took down the ATI branding way too many years in advance.
I don't think the AMD name did anything to tarnish the ATI brand that ATI's own history didn't already do to them. Many of ATI's older products (I'm talking late 90's here) were just outright crap. Even though they had the All-In-Wonder it was a wonder if you could get the software to work. From my (long) view, AMD has improved the quality of the drivers and software, albeit more slowly than optimal.
 

Contiusa

Junior Member
Oct 8, 2013
20
0
0
I do think it's sort of silly that so much focus has been placed on efficiency for the mid-to-high-end cards. It makes sense for the budget/low-end cards because it's great to drop them into a cheap store-bought desktop (see 750 Ti), but the vast majority of users are not pulling out the kilowatt meters to see how much power they're using while they play GTA V.

I would like to say that this is great marketing on NVIDIA's part, but I'm not so sure it is. It's more their users that have done the marketing for them on this front. I haven't seen such an emphasis in PC gaming forums on power efficiency and perf/W in... well... ever. If the rumors about the lower power usage for the 300 series are true, though, then good for AMD, because it seems like they need to be better, or at least compete, on every front to even be considered a purchase these days.

I am going to jump on the bandwagon for a bit because I have been awaiting the new AMD series with anticipation, but I own a GTX 770 Lightning, and one thing I love about it is that the card runs cold as a corpse. It draws more than 220W and I am OK with that, but I would love to have a GTX 980 that draws only 160W. You get my point.

The more efficient they are, the cooler and, well... the more efficient they are. I look at my case and think of my GTX 770 sitting in its PCI-E slot, and I can't help thinking the card is a piece of art in all regards: performance, temperature and power efficiency (when the card was released). And this feeling sells, and sells big.

I would never upgrade to a graphics card that drew 300W from the wall and needed a water-cooling solution to keep it below 70°C, unless it was a card that brought the Titan X to its knees and I had lots of money not only to buy it, but to customize a cooling solution for the whole rig. And how many people have the money to do that?

And this is something that we pass along to other people, the so-called word of mouth. Hence mid-range / top-range cards with good efficiency sell so well and are so well spoken of.

Cheers,
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Boy I'm finding the anticipation growing more and more, keep the leaks coming.

I can't help thinking that perhaps the efficiency metric is here to stay. It feels good to save power, and it's nice to avoid heating up your room in the summer. But I still remember a time long ago when it felt really nice to think about how much power my computer was using. Like if you mod a car engine to burn tons of gas because it's just awesome. But we are switching to a Tesla/Prius mindset with cars, driving efficiently. I can see that with computers too.

I decided not to get a 290 or 290X and just wait it out to see the 300 series release. Gah, why can't Nvidia support mixed-resolution displays in Surround gaming?
 

jji7skyline

Member
Mar 2, 2015
194
0
0
tbgforums.com
Boy I'm finding the anticipation growing more and more, keep the leaks coming.

I can't help thinking that perhaps the efficiency metric is here to stay. It feels good to save power, and it's nice to avoid heating up your room in the summer. But I still remember a time long ago when it felt really nice to think about how much power my computer was using. Like if you mod a car engine to burn tons of gas because it's just awesome. But we are switching to a Tesla/Prius mindset with cars, driving efficiently. I can see that with computers too.

I decided not to get a 290 or 290X and just wait it out to see the 300 series release. Gah, why can't Nvidia support mixed-resolution displays in Surround gaming?
Now you make me want to buy the most power-hungry card AMD releases, just to distance myself from the hybrid drivers.
 

crashtech

Lifer
Jan 4, 2013
10,556
2,139
146
Each increment of power must eventually bring an increment of efficiency, else we will end up using liquid metal to cool our rigs. That said, I don't mind burning as much power as my rig can dissipate as long as the performance is there.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
Like if you mod a car engine to burn tons of gas because it's just awesome. But we are switching to a Tesla/Prius mindset with cars, driving efficiently.

Hey now, I have a big ol' modded 'Murican V8 and I'm trying my hardest to get the best gas mileage I can out of it... Best tank so far is 21 MPG! :thumbsup:
 

SimianR

Senior member
Mar 10, 2011
609
16
81
I am going to jump on the bandwagon for a bit because I have been awaiting the new AMD series with anticipation, but I own a GTX 770 Lightning, and one thing I love about it is that the card runs cold as a corpse. It draws more than 220W and I am OK with that, but I would love to have a GTX 980 that draws only 160W. You get my point.

The more efficient they are, the cooler and, well... the more efficient they are. I look at my case and think of my GTX 770 sitting in its PCI-E slot, and I can't help thinking the card is a piece of art in all regards: performance, temperature and power efficiency (when the card was released). And this feeling sells, and sells big.

I would never upgrade to a graphics card that drew 300W from the wall and needed a water-cooling solution to keep it below 70°C, unless it was a card that brought the Titan X to its knees and I had lots of money not only to buy it, but to customize a cooling solution for the whole rig. And how many people have the money to do that?

And this is something that we pass along to other people, the so-called word of mouth. Hence mid-range / top-range cards with good efficiency sell so well and are so well spoken of.

Cheers,

You definitely make some good points. I guess I just think that, if I had to use a car analogy, the perfect one would be that no one buys a Ferrari for its gas mileage. But I suppose if you can have the entire package, then why not? Maybe the Tesla Model S is the perfect example of changing attitudes.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I am going to jump on the bandwagon for a bit because I have been awaiting the new AMD series with anticipation, but I own a GTX 770 Lightning, and one thing I love about it is that the card runs cold as a corpse. It draws more than 220W and I am OK with that, but I would love to have a GTX 980 that draws only 160W. You get my point.

The more efficient they are, the cooler and, well... the more efficient they are. I look at my case and think of my GTX 770 sitting in its PCI-E slot, and I can't help thinking the card is a piece of art in all regards: performance, temperature and power efficiency (when the card was released). And this feeling sells, and sells big.

I would never upgrade to a graphics card that drew 300W from the wall and needed a water-cooling solution to keep it below 70°C, unless it was a card that brought the Titan X to its knees and I had lots of money not only to buy it, but to customize a cooling solution for the whole rig. And how many people have the money to do that?

And this is something that we pass along to other people, the so-called word of mouth. Hence mid-range / top-range cards with good efficiency sell so well and are so well spoken of.

Cheers,

Well, it's also not that fast by today's standards and it has a gigantic two-slot cooler. It's good that you're happy with it, but you make it sound like GOD himself built the thing. The reality is that its performance today is at best right in the middle of midrange, and it takes up as much or more space than today's high-end cards, so yeah... it had better run cool.

Then we get to performance: its direct competitor is what, the 7970 GHz? Which outperforms it in just about everything and has an additional 1GB of RAM to boot. When we get into heat, power and noise, it's not any better either.

http://www.anandtech.com/show/6994/nvidia-geforce-gtx-770-review/16

I think you're deluding yourself as to the awesomeness of that card.
 
Last edited:

iiiankiii

Senior member
Apr 4, 2008
759
47
91
I am going to jump on the bandwagon for a bit because I have been awaiting the new AMD series with anticipation, but I own a GTX 770 Lightning, and one thing I love about it is that the card runs cold as a corpse. It draws more than 220W and I am OK with that, but I would love to have a GTX 980 that draws only 160W. You get my point.

The more efficient they are, the cooler and, well... the more efficient they are. I look at my case and think of my GTX 770 sitting in its PCI-E slot, and I can't help thinking the card is a piece of art in all regards: performance, temperature and power efficiency (when the card was released). And this feeling sells, and sells big.

I would never upgrade to a graphics card that drew 300W from the wall and needed a water-cooling solution to keep it below 70°C, unless it was a card that brought the Titan X to its knees and I had lots of money not only to buy it, but to customize a cooling solution for the whole rig. And how many people have the money to do that?

And this is something that we pass along to other people, the so-called word of mouth. Hence mid-range / top-range cards with good efficiency sell so well and are so well spoken of.

Cheers,

First off, the GTX 770 is average at best today. A 2GB card? What?! In terms of efficiency, it ain't that great. The R9 280X has 3GB, is a bit faster overall, and consumes about the same amount of power. Your GTX 770 feels cool and quiet because of the gigantic cooler. That cooler is good enough to cool a card over 300 watts. Don't let that fool you into thinking your card's TDP is that great. You can slap that cooler on most midrange GPUs and get the same cool and quiet effect. There are PLENTY of cards that are just as cool and quiet.

You're right, improved efficiency is great. But top-end cards will always draw more power than their mainstream counterparts. The Titan X draws over 250 watts under load. If you overclock and overvolt it, it can go upwards of 400 watts! That's the nature of HIGH-END graphics.

The thing about GPU progress is simple: you have to improve efficiency to reasonably improve performance. It seems top cards are topping out at 250-300 watts. I expect every generation to improve efficiency, and I expect the upcoming 250-300 watt card from AMD to compete well with the Titan X.
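
That last paragraph is really just an identity: if flagship boards are pinned at roughly the same power ceiling, generational performance at that ceiling scales one-for-one with perf/watt. A tiny sketch with hypothetical numbers:

```python
# Hypothetical: flagships pinned at ~275 W, and a new generation with 40% better perf/watt
power_budget_w = 275.0
old_perf_per_watt = 1.0   # normalized efficiency of the outgoing flagship
new_perf_per_watt = 1.4   # assumed 40% generational efficiency gain

old_perf = old_perf_per_watt * power_budget_w
new_perf = new_perf_per_watt * power_budget_w

# At a fixed power budget, the performance gain equals the efficiency gain
print(f"Performance at {power_budget_w:.0f} W: {new_perf / old_perf:.2f}x the previous flagship")
```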
 
Last edited:

Contiusa

Junior Member
Oct 8, 2013
20
0
0
Well, it's also not that fast by today's standards and it has a gigantic two-slot cooler. It's good that you're happy with it, but you make it sound like GOD himself built the thing. The reality is that its performance today is at best right in the middle of midrange, and it takes up as much or more space than today's high-end cards, so yeah... it had better run cool.

Then we get to performance: its direct competitor is what, the 7970 GHz? Which outperforms it in just about everything and has an additional 1GB of RAM to boot. When we get into heat, power and noise, it's not any better either.

http://www.anandtech.com/show/6994/nvidia-geforce-gtx-770-review/16

I think you're deluding yourself as to the awesomeness of that card.

How can you compare my old card with what they have today? You must be joking.

And from what I understand, my card runs way cooler than anything AMD had to offer back then, when their cards were already starting to be known as "hot", which denotes a lack of efficiency, a trend that seems to endure. Then Nvidia came out with the GTX 900 series, which is amazingly efficient: the GTX 980 draws almost half of what the GTX 770 uses and is way stronger in performance.

I can't see how people will cheer anything that does not come close to it. And they won't.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
How can you compare my old card with what they have today? You must be joking.

And from what I understand, my card runs way cooler than anything AMD had to offer back then, when their cards were already starting to be known as "hot", which denotes a lack of efficiency, a trend that seems to endure. Then Nvidia came out with the GTX 900 series, which is amazingly efficient: the GTX 980 draws almost half of what the GTX 770 uses and is way stronger in performance.

I can't see how people will cheer anything that does not come close to it. And they won't.

I actually compared it with the 7970, which is a bit older than your card. I sent you the link; feel free to read it. The 770 is slower and not doing any better in regards to power, heat and noise. Like I said, you're deluding yourself. Check the post right after mine: he's telling you almost exactly what I did. We aren't making it up.

If we look at the numbers, and if we are to believe that what you say is important to you actually is important to you, then it appears that you purchased the wrong card.

The closer we look at this, the more it sounds like you simply have an Nvidia preference but attempted to write a post with a façade of neutrality that really wasn't neutral.
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,556
2,139
146
I can't see how people will cheer anything that does not come close to it. And they won't.
I will, if it puts up the numbers. What matters in this segment is performance, and whether the heat can be dissipated. Power consumption is a tertiary concern.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I will, if it puts up the numbers. What matters in this segment is performance, and whether the heat can be dissipated. Power consumption is a tertiary concern.

Exactly: if price and performance between two competing cards are so close that it's pretty much a toss-up, efficiency is basically the tiebreaker.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Exactly: if price and performance between two competing cards are so close that it's pretty much a toss-up, efficiency is basically the tiebreaker.
So a 50-watt difference under full load plus a 10-15% performance difference equals a $250 price premium? You are OK with that? D: And 250/280 is 89%, so that premium alone is almost the price of a second 290X; you're nearly paying for two 290Xs to get one 980. D: You are OK with that?

Top tier always requires a price premium, but for a 10-15% performance difference? Come on. You might as well give your money to charity if you are going to spend it this way, and do some good with it.
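
For what it's worth, here is that arithmetic spelled out in a short sketch. The inputs are the figures used in this thread (a ~$280 290X, the ~$250 premium for a 980, a 10-15% performance gap, ~50 W under load), so treat them as the poster's assumptions rather than quoted street prices:

```python
# Figures as used in the post above
price_290x = 280.0
premium_980 = 250.0
price_980 = price_290x + premium_980   # ~$530 implied
perf_gap = 0.125                       # midpoint of the quoted 10-15%
power_gap_w = 50.0

print(f"Premium as a fraction of a second 290X: {premium_980 / price_290x:.0%}")              # ~89%
print(f"Dollars per percentage point of extra performance: ${premium_980 / (perf_gap * 100):.0f}")
print(f"Dollars per watt saved at full load: ${premium_980 / power_gap_w:.2f}")
```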
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
So a 50-watt difference under full load plus a 10-15% performance difference equals a $250 price premium? You are OK with that? D: And 250/280 is 89%, so that premium alone is almost the price of a second 290X; you're nearly paying for two 290Xs to get one 980. D: You are OK with that?

Top tier always requires a price premium, but for a 10-15% performance difference? Come on. You might as well give your money to charity if you are going to spend it this way, and do some good with it.

I'm going to quote myself and make a couple of enhancements, since I seem to have lost you at some point during the entire one sentence of my post.

Exactly: if price and performance between two competing cards are so close that it's pretty much a toss-up, efficiency is basically the tiebreaker.

I hope that clears it up.
 