[ PcPer ] AMD Radeon R9 290X Now Selling at $299


Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
I see we share the same opinion across the entire subject. I dumped my 980 because, while it was more powerful than my 290X Lightning, the difference was only really visible in some benchmarks, aside from what my power meter said. The 980 and 970 are impressive from a technological standpoint but will soon be forgotten because, as you put it, they are filler cards. Both merely offer the same overall performance level already obtainable, just at better price points and lower power, which screams new midrange.

If I were buying a card right now, I would probably snag a 290(X) or 780 (Ti), as the prices have dropped (new or used) to where they are actually bargains compared to the new cards.

Even the 295X2 and the Z are at prices where, if a single card is desirable for lots of performance, they are reasonable considerations.


The 295X2 can be had for £599.99.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
No, I'm pretty sure what I said was right. Some non-reference models run at 250 W, and the testing methodologies may not be that good at some sites, so it's possible the big difference doesn't show up sometimes. This is a problem for many Tonga and Maxwell models, where one of the selling points is efficiency.

I'm not sure I understand what you are saying
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
No, I'm pretty sure what I said was right. Some non-reference models run at 250 W, and the testing methodologies may not be that good at some sites, so it's possible the big difference doesn't show up sometimes. This is a problem for many Tonga and Maxwell models, where one of the selling points is efficiency.

It's entirely false. Sure, the 970 is more efficient, that's not the question, but it's not 150 W vs. 300 W.

My system with 5 HDDs/SSDs, dual monitors, and dual 290X reference cards (the most inefficient 290X) uses 600-700 W while gaming. Your claim seriously lacks factual data.

The TechSpot review just shows system power consumption, and there's a 10% difference or so, not a 100% difference. I know it can be more in a best-case scenario, but show me your proof if you want to continue the FUD, since it's not even close to accurate.
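For context on why whole-system numbers compress the gap, here is a minimal sketch of the arithmetic; every wattage figure below is an assumption for illustration, not a measurement from any review.

```python
# Rough illustration: a big GPU-only power gap looks much smaller
# when you only measure total system draw at the wall.
# All numbers below are assumptions for the sake of the example.

rest_of_system_w = 350.0   # CPU, drives, fans, monitors, PSU losses (assumed)
gpu_a_w = 180.0            # hypothetical "efficient" card under load
gpu_b_w = 250.0            # hypothetical "hungry" card under load

gpu_only_diff = (gpu_b_w - gpu_a_w) / gpu_a_w * 100
wall_a = rest_of_system_w + gpu_a_w
wall_b = rest_of_system_w + gpu_b_w
wall_diff = (wall_b - wall_a) / wall_a * 100

print(f"GPU-only difference: {gpu_only_diff:.0f}%")   # ~39%
print(f"At-the-wall difference: {wall_diff:.0f}%")    # ~13%
```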

/done until you present proof of your claim
 

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
It's entirely false. Sure, the 970 is more efficient, that's not the question, but it's not 150 W vs. 300 W.

My system with 5 HDDs/SSDs, dual monitors, and dual 290X reference cards (the most inefficient 290X) uses 600-700 W while gaming. Your claim seriously lacks factual data.

The TechSpot review just shows system power consumption, and there's a 10% difference or so, not a 100% difference. I know it can be more in a best-case scenario, but show me your proof if you want to continue the FUD, since it's not even close to accurate.

/done until you present proof of your claim

It is in situations like these that you realize Nvidia's lying marketing strategies work on consumers.

They make you believe something, then people buy en masse = BIG WIN for them.

It's true that the GTX 970 is efficient but far from what they claim.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Imo, Hawaii was one of the worst architectures in recent history. MSI has a 970 that they rate at 148 watts, meaning two of them in SLI still use a tad less power than one 290X. It's embarrassing. I feel I'm locked into AMD cards because of Mantle, but I feel the best AMD cards are used ~$100 Radeon 7950s from decommissioned mining rigs.


I think a very good buy at the moment is a used 280X from eBay. I picked one up myself cheap and still under warranty, a good upgrade over my MSI 560 Ti card. Also, the Nvidia 970 coil whine issues put me off that, so I decided to wait until next year for the R9 390s and the rest of Nvidia's new range.

Thread on coil whine here at OCUK: http://forums.overclockers.co.uk/showthread.php?t=18630402


Btw, happy with my Sapphire 280X Dual-X OC card, no driver issues, especially going from Nvidia to AMD.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Imo, Hawaii was one of the worst architectures in recent history. MSI has a 970 that they rate at 148 watts, meaning two of them in SLI still use a tad less power than one 290X. It's embarrassing.

That's BS. What 970 card uses 148 W of power? There is no such card! You just bought into NV's marketing claim that TDP = power consumption. In reality, everyone who has followed GPUs for the last 20 years knows the opposite is true => TDP does NOT equal power consumption. TDP stands for Thermal Design Point, not power usage.

Here is a recent review of 970 SLI vs. 290 CF at 4K, which stresses the cards to the limit. Where are these 200-300 W power differences you speak of?





Most reviewers have failed consumers and should be ashamed of themselves for pitting reference 947 MHz R9 290 cards that run hot and loud and use way too much power (because they run hot, and there is a relationship between heat and the power usage of an ASIC). Of course, if your job is to sell the latest and greatest product and get a kickback from positive reviews, I can understand. But why have most reviewers, unlike TechSpot, failed to revisit the price/performance of R9 290s vs. 970 SLI one month after launch, given the price adjustments?

BTW, most reviewers also failed to revisit after-market 7950 V2 and 7970 GHz editions against the 670/680. Why is that?

At $230-250, an after-market R9 290 is a bargain compared to 970/980.

It's true that the GTX 970 is efficient but far from what they claim.

HardOCP confirms that the GTX 970 does not have the power load balancing of the 980 reference cards. That's why after-market 970s cannot match the efficiency of the 980 in reviews. ** The chart below uses reference R9 290s, which, per usual, is the typical practice of reviewers... yawn.

 
Last edited:

mindbomb

Senior member
May 30, 2013
363
0
0
That review was using non-reference cards with higher TDP values. Similarly, many R9 285 reviews were also done with cards that strayed from AMD's official TDP value. I think the real problem for consumers is AIBs taking too many liberties with their designs. Nvidia and AMD both need stricter quality control. Anyway, here is the 148-watt 970: http://us.msi.com/product/vga/GTX-970-GAMING-4G.html#hero-specification ; presumably the measurement was taken in silent mode.
 
Last edited:

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
It is amazing how important measuring power has become on these forums. I'm going to head over to the Ferrari forums and see how many people are bitching about fuel consumption.

Is this the HTPC section?
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Nvidia and AMD both need stricter quality control. Anyway, here is the 148-watt 970: http://us.msi.com/product/vga/GTX-970-GAMING-4G.html#hero-specification ; presumably the measurement was taken in silent mode.


To be fair, you can't blame them for it all; their partners that make the cards for them have to take some of the blame. The coil whine issue with 970s, for example, you can blame on both Nvidia and the third-party partners.

Both Nvidia and AMD have their reference design guidelines, but it's up to their partners whether they stick to them or improve on the reference design.
 

l2ez4m

Member
Aug 25, 2012
47
0
66
No offense, Russian, but you should ditch those "trusty Kill A Watt" wallpapers for good. Give us some proper data worthy of your status.
And when we compare actual aftermarket SKUs using a slightly more sophisticated methodology, the difference is apparent:





So on average the 970 is ~15% faster @1080p and ~10% faster @1440p once overclocked, runs a bit cooler, is less noisy thanks to those extra watts consumed on Hawaii, throttles less once overclocked, and puts less thermal strain and acoustic contamination on the overall platform, i.e. CPU/mobo, case ventilation, etc.

So in the long run - let's say a year - I'd be glad to pay that measly extra $5/month just for pure personal comfort, totally ignoring any brand-loyalist agenda.
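As a rough sanity check on that "$5/month" figure, here is a minimal sketch of the arithmetic; the wattage gap, gaming hours, and electricity price are all assumed placeholders, and the result swings a lot depending on what you plug in.

```python
# Back-of-the-envelope monthly electricity cost of an extra power draw.
# All inputs are assumptions; plug in your own numbers.

extra_watts = 75          # assumed extra draw of the less efficient card (W)
hours_per_day = 4         # assumed gaming hours per day
price_per_kwh = 0.15      # assumed electricity price in $/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")
# With these assumptions: ~9 kWh/month -> ~$1.35/month; heavier use or
# pricier electricity pushes the figure toward the quoted $5/month.
```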
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So in the long run - let's say a year - I'd be glad to pay that measly extra $5/month just for pure personal comfort, totally ignoring any brand-loyalist agenda.

Your review only backs up the data I provided - a 50 W power difference on one card, but in SLI/CF it'll be less than 50 W extra for the second card since SLI/CF scaling is not 100%. Of course 970 SLI is more efficient, no doubt, but you are paying a lot of money to save those 50-100 W. It'll take 5+ years of hardcore everyday gaming to recoup the $100-200 extra spent on a 970/970 SLI through electricity savings.
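A minimal sketch of that payback argument; the price premium and watt savings roughly match the figures in this exchange, while the gaming hours and electricity price are assumptions.

```python
# How long does a price premium take to pay back through electricity savings?
# Premium and watt savings roughly match the figures discussed above;
# hours and $/kWh are assumptions.

price_premium = 150.0     # extra cost of the more efficient card ($)
watts_saved = 75.0        # assumed average power savings while gaming (W)
hours_per_year = 4 * 365  # assumed 4 hours of gaming per day
price_per_kwh = 0.15      # assumed electricity price ($/kWh)

savings_per_year = watts_saved / 1000 * hours_per_year * price_per_kwh
payback_years = price_premium / savings_per_year
print(f"${savings_per_year:.0f}/year saved -> ~{payback_years:.1f} years to break even")
# With these assumptions: ~$16/year, so roughly 9 years to break even.
```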

As I already said, an after-market R9 290 runs cool. If it makes you feel better that your card runs at 61°C instead of 73°C, then I can't really change your mind. In the end, if you have more peace of mind running your card overclocked at 75°C vs. 85°C, that's personal preference. All things being equal, I would take the cooler-running card, but in this case all things aren't equal due to the large pricing difference.

Remember, AMD specified Hawaii for 95°C full-load operation, and a max-overclocked after-market R9 290 doesn't even approach that.


Secondly, you said $5 a month over one year, but it's not $5, since the cheapest R9 290s have dropped to $230-260 vs. $330-370 for a 970. That's on a per-card basis, but what if you are going dual?

Think about it: both 970 SLI and R9 290 CF will become outdated at the same time. When it comes time to upgrade in 1.5-2.5 years, you will have $200 more to buy something faster. Alternatively, that's $200 you can use to buy Steam games this winter sale, upgrade an i5 to an i7, or get a 3TB hard drive, a 512GB SSD, or a new case. My point is, if you keep paying $200-300 extra every two years for NV, over time it adds up to thousands of dollars. To each his own, but I think most reviewers have done a disservice to gamers because:

- They continue to test reference AMD models against factory pre-overclocked after-market NV models, an invalid comparison that has little to do with how after-market R9 290s (or 7970 GHz cards, for that matter) compare in temperatures, noise levels, and performance against NV's competing SKUs

- Because they test reference models, the cards throttle, which is not representative of after-market R9 290 cards' performance

- They ignore the added value of the game bundles that come with price-reduced GTX 780 Ti and R9 290 cards.

- They get overly excited about marketing features like DX12 while ignoring where the chip sits - GM204 is just mid-range, yet they fail to criticize or comment on how overpriced it appears for a mid-range next-gen product. Why?

That review was using non-reference cards with higher TDP values. Similarly, many R9 285 reviews were also done with cards that strayed from AMD's official TDP value. I think the real problem for consumers is AIBs taking too many liberties with their designs. Nvidia and AMD both need stricter quality control. Anyway, here is the 148-watt 970: http://us.msi.com/product/vga/GTX-970-GAMING-4G.html#hero-specification ; presumably the measurement was taken in silent mode.

If you want the most power efficient after-market 970/980 card, Asus Strix is the answer.



Nearly everyone else pushed for performance and max overclocking over efficiency. Then again I don't understand at all why anyone would pay $550 for an Asus Strix 980 to save on power usage given how quickly flagship cards depreciate.

It is amazing how important measuring power has become on these forums. I'm going to head over to the Ferrari forums and see how many people are bitching about fuel consumption.

Is this the HTPC section?

I've noticed the change, and it all started after Kepler. Not sure if it's NV's brand marketing that's making performance/watt such a big deal, or people's desire to have a powerful PC in a mini-ITX case to compare against the PS4, but I will say right now I'll take performance over power usage nearly 90% of the time.

Give me more CPU cores, more IPC, more CUDA/Stream processors, more power!!


What's strange to me is, if efficiency and performance/watt are now the leading factors, why aren't we ditching PCs for consoles? One or two node shrinks on the PS4 and the PC will have no chance of competing in performance/watt in cross-platform games.



All I know is once a 250-300 W GM200/390X or Pascal drops, people will be salivating and lining up, because PC = performance platform. It seems to me right now gamers just want to justify why the 970/980 are worth upgrading to, since they hardly brought a performance change over the year-old 290X/780 Ti. So there is really nothing to talk about, since DSR is now available on Kepler, and DX12 and MFAA are MIA.
 
Last edited:

KaRLiToS

Golden Member
Jul 30, 2010
1,918
11
81
It is amazing how important measuring power has become on these forums. I'm going to head over to the Ferrari forums and see how many people are bitching about fuel consumption.

Is this the HTPC section?

Sorry, but I don't think you get it. Nvidia made low power consumption their main marketing strategy. A couple of weeks later, we find out it's not true.

On the other hand, I don't think Ferrari said they were going to sell a low-energy V12 that consumes 5 L/100 km.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Sorry, but I don't think you get it. Nvidia made low power consumption their main marketing strategy. A couple of weeks later, we find out it's not true.

On the other hand, I don't think Ferrari said they were going to sell a low-energy V12 that consumes 5 L/100 km.

I wasn't just making the comment in regards to the 970.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I've noticed the change, and it all started after Kepler. Not sure if it's NV's brand marketing that's making performance/watt such a big deal

Imho,

I don't know about you, but to me this really started with the weak performance/watt of the first iterations of Fermi. My personal constructive nit-pick was performance/watt, and for nVidia to compete they needed to be more efficient. The reference designs were loud and hot, and one had to wait a bit for AIB differentiation and more robust cooling.

It's clear that nVidia was proud of their efficiency and marketed it strongly with Kepler, and the third-party sites also discussed efficiency. The same can be said for Maxwell.

I don't understand why performance/watt isn't important to some level, given that efficient chips tend to be cooler and quieter. It's important enough that third-party reviews discuss power and investigate it. Performance/watt may not be as important as raw performance or performance/dollar, but it's still an important metric, and it may be the most important metric for gauging architectures.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Funny how KaRLiToS's original post about the R9 290X could "morph" into another red team vs. green team battle!

To comment on the OP's post, I'm glad I waited till I could pick up my Sapphire Tri-X 290s at about $290 each. Watercooling really helps the temps.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't understand why performance/watt isn't important to some level, given that efficient chips tend to be cooler and quieter. It's important enough that third-party reviews discuss power and investigate it. Performance/watt may not be as important as raw performance or performance/dollar, but it's still an important metric, and it may be the most important metric for gauging architectures.

Depends on whose perspective we are looking at. For example, AMD has limited financial resources and has never made a 500mm² GPU die. For them, performance per mm² and compute per mm² are more important because they have not been able to fab 520-560mm² GPU dies up to this point. Naturally, if you cannot afford a huge, wide chip, you have to pack your transistors a lot more densely, which has the negative effect of higher power consumption.

As far as the importance of performance/watt goes, it is very important for mobile and as a gauge of how fast the flagship 250-275 W cards will be. However, if I am asked to pay $550-600 for a "marketing flagship" mid-range card and the main selling point is performance/watt, I laugh at that marketing spin all day. Why? As I said, if I wanted performance/watt, I'd get a Wii U or a PS4. Or, you know what, a Core i5 T-series and a GTX 750 Ti. I would never overclock any CPU or GPU, and I would never game on a 37" LCD or 60" plasma, because once those are taken into account the performance/watt of my gaming hobby is going to tank hard. I should downgrade to a 17-inch LCD that uses 50 W of power. :biggrin:

Anyway, going back to the comparison of 970 SLI vs. 290s, let's say the former setup uses 100 W or even 150 W less power (and reviews have shown it to be less when looking at after-market cards); why should I care about using less electricity, or about the marketing performance/watt metric, when I have to pay $150-200 more to achieve those efficiency gains? In that case, why even overclock my CPU, as that completely destroys performance/watt too? I guess we are all wasting money on those K-series Intel i5/i7s (see my chart above of what happens once you start overclocking Intel CPUs with more voltage).

As any person in finance will tell you, prove to me that the rate of return on that purchase is worth the electricity cost or performance/watt savings. If you cannot, performance/$ wins once you take emotions out of the equation. That's why a lot of people find performance/watt as a key metric to be a false notion: usually you are asked to pay exponentially more to achieve those efficiency gains (for example, the $550 680 4GB vs. the $450 7970 GHz, or the $370 GTX 780 Ti vs. the $550 980). The lack of a sufficient rate of return is similar to why electric cars are also failing to take off at the moment, and why not everyone thinks the additional cost of a diesel engine in North America is worth the time it takes to break even on that extra cash up front. To make sense of performance/watt in environments where you are not really power constrained (desktop PC gaming), you have to account for what it actually costs you to get those efficiency gains. This second part of the equation, which considers costs, is generally ignored by NV and NV users as if it doesn't even exist because...?

Why don't we have a graph that depicts performance/watt relative to dollar value? That would be way more valuable, because you wouldn't buy a GTX 980 that used 1 W of power for $999, would you?
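Something along those lines is easy to sketch; the cards and numbers below are purely hypothetical placeholders, not measured data, and the combined metric is just one possible way to fold price into the picture.

```python
# Sketch of a combined "performance per watt per dollar" style comparison.
# The entries are hypothetical placeholders, not real benchmark data.

cards = {
    # name: (relative performance, board power in W, street price in $)
    "Card A": (100, 275, 260),
    "Card B": (110, 180, 340),
}

for name, (perf, watts, price) in cards.items():
    perf_per_watt = perf / watts
    perf_per_dollar = perf / price
    combined = perf / (watts * price)   # one way to fold cost into the picture
    print(f"{name}: perf/W={perf_per_watt:.2f}, perf/$={perf_per_dollar:.3f}, "
          f"perf/(W*$)={combined * 1000:.2f} (x1000)")
```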

Further, if you are talking about a desktop PC gaming powerhouse rig with dual or triple GPUs, it's already using 400-500 W; what difference does it make if the cheaper system uses 550-650 W? Even though in the summer it requires more AC to cool your gaming room, in the winter it requires less heating. It goes both ways. Even when discussing single GPUs, you now have to pay almost $100 more for a 970, which will take way too long to recoup through saved electricity.

---

And BTW, as you recall from various discussions, when AMD had superior performance/watt, going back to the HD 4870, 5870, and 6970, NV still maintained its market share lead through to today, despite losing some share due to failing to get Fermi out on time. Even after they refreshed with the GTX 570/580, they still trailed AMD in performance/watt and were not even on the map during the HD 7950/7870/7970/7850/7770 period until they rolled out the entire Kepler line-up, which took them 9 months! During that time there was no sudden wave of NV users migrating to AMD, which proves that performance/watt is NOT the key driving metric of GPU sales on the desktop. Some people still bought the GTX 285 for 6 months when the 5850/5870 were around, and the GTX 570/580 when the HD 7950/7870/7970 launched, because of other factors such as features, brand loyalty, etc.

Ask yourself this question: what if a hypothetical NV GM200 were 50% faster than the GTX 980 and used 275 W? Do you think people would not buy GM200 because it used "too much power" vs. the 980? Alternatively, if AMD released a 390X that used 200 W and was 50% faster than the 980 but cost $999, would people buy it? You should not ignore the additional money required to achieve specific efficiency gains.
 
Last edited:

el etro

Golden Member
Jul 21, 2013
1,581
14
81
Most reviewers have failed consumers and should be ashamed of themselves for pitting reference 947 MHz R9 290 cards that run hot and loud and use way too much power (because they run hot, and there is a relationship between heat and the power usage of an ASIC). Of course, if your job is to sell the latest and greatest product and get a kickback from positive reviews, I can understand. But why have most reviewers, unlike TechSpot, failed to revisit the price/performance of R9 290s vs. 970 SLI one month after launch, given the price adjustments?

BTW, most reviewers also failed to revisit after-market 7950 V2 and 7970 GHz editions against the 670/680. Why is that?

At $230-250, an after-market R9 290 is a bargain compared to 970/980.



HardOCP confirms that the GTX 970 does not have the power load balancing of the 980 reference cards. That's why after-market 970s cannot match the efficiency of the 980 in reviews. ** The chart below uses reference R9 290s, which, per usual, is the typical practice of reviewers... yawn.


Agreed.

It is in situations like these that you realize Nvidia's lying marketing strategies work on consumers.

They make you believe something, then people buy en masse = BIG WIN for them.

It's true that the GTX 970 is efficient but far from what they claim.

Nvidia works well to make their strengths look even stronger.
Better temperatures lead to better reviews and better power consumption, and that's why they put a 250 W cooler (the Titan cooler) on the GTX 980 & GTX 970 cards. The cooler keeping the temps low certainly made the Maxwell cards consume less.








EDIT3: Look, a Tri-X R9 290X at 1060 MHz with GPU-Z showing 263 W peak power consumption on DC while gaming: http://adrenaline.uol.com.br/forum/showthread.php?t=492799&p=1070430214#post1070430214 (translate from Brazilian Portuguese if you want)

He has no data, but he is a friend of mine and a reputable source.

He also says 234 W peak at stock Tri-X settings in Ryse: Son of Rome: http://adrenaline.uol.com.br/forum/showthread.php?t=492799&p=1070430128#post1070430128

According to him, well-cooled stock Hawaii does not consume that badly (no way does it consume less than a GK110 or GM204 chip), but once overclocked the energy used goes up a lot.
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
I think a very good buy at the moment is a used 280X from eBay. I picked one up myself cheap and still under warranty, a good upgrade over my MSI 560 Ti card. Also, the Nvidia 970 coil whine issues put me off that, so I decided to wait until next year for the R9 390s and the rest of Nvidia's new range.

Thread on coil whine here at OCUK: http://forums.overclockers.co.uk/showthread.php?t=18630402


Btw, happy with my Sapphire 280X Dual-X OC card, no driver issues, especially going from Nvidia to AMD.
Heh, I sold my three-month-old 280X that I got off eBay. I was planning on buying another for CF, but then the 290 price dropped! Sold it for $150 (local pickup, so all of $150), spent $135 more, and got a 290 Tri-X. Kinda wish I had waited a bit more; it was literally a week later that the 290 PCS+ price dropped to $260. I am literally laughing at all the 970 buyers.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Ask yourself this question: what if a hypothetical NV GM200 were 50% faster than the GTX 980 and used 275 W? Do you think people would not buy GM200 because it used "too much power" vs. the 980? Alternatively, if AMD released a 390X that used 200 W and was 50% faster than the 980 but cost $999, would people buy it? You should not ignore the additional money required to achieve specific efficiency gains.

It's not about 250 W vs. 190 W but about performance per watt used. For example, the GTX 780 was much more powerful and used more watts, but it was just as efficient as the GTX 680.

nVidia has come a long way from the first iterations of Fermi.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I hope all you power consumption gurus are running a frame limiter too, because if you aren't, then you are missing out on a ton more power savings there.

A frame cap at your refresh rate, especially at 60 Hz, is a godsend for keeping GPU temps and power down.

My rig barely ever exceeds 465 watts unless I'm at super-demanding parts of a game; then the card just dynamically clocks up to whatever it needs and powers through until it doesn't need to run so hard.

Also, running with a frame cap makes even single-GPU gaming a very smooth experience.
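For anyone curious what a frame cap actually does, here is a minimal sketch of the idea as a simple sleep-based limiter; real limiters in drivers or tools are more sophisticated, and the render_frame stand-in and frame count are purely illustrative.

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame at 60 Hz

def render_frame():
    # Stand-in for the actual game/render work.
    pass

for _ in range(600):  # run ~10 seconds' worth of frames for the demo
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # If the frame finished early, sleep off the remainder instead of
    # immediately starting the next one. The GPU idles during the sleep,
    # which is where the power, heat, and noise savings come from.
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)
```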
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Indeed! It may sound odd to some degree, but frame limiters are very welcome - especially for multi-GPU!
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
To make sense of performance/watt in environments where you are not really power constrained (desktop PC gaming), you have to account for what it actually costs you to get those efficiency gains. This second part of the equation, which considers costs, is generally ignored by NV and NV users as if it doesn't even exist because...?

Your whole post doesn't make any sense.
Nobody would buy a 290(X) card for the same price as a GTX 970/GTX 980. That's the simple reason AMD cut their prices. The better perf/watt ratio leads to this.

They are not competitive from a hardware perspective, so they undercut nVidia with much lower prices. Maybe next time just appreciate the competition.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I hope all you power consumption gurus are running a frame limiter too, because if you aren't, then you are missing out on a ton more power savings there.

A frame cap at your refresh rate, especially at 60 Hz, is a godsend for keeping GPU temps and power down.

My rig barely ever exceeds 465 watts unless I'm at super-demanding parts of a game; then the card just dynamically clocks up to whatever it needs and powers through until it doesn't need to run so hard.

Also, running with a frame cap makes even single-GPU gaming a very smooth experience.

In reality, they are likely bumping the voltage and OC'ing them as far as they can.
 