Nvidia are on a roll


imported_RedStar

Senior member
Mar 6, 2005
526
0
0
"...In my opinion you're somewhat silly to go out and grab a G80 *right now* if you've already got an extremely competent card... " --dug777

I agree. But I think you are missing the point. Only around 5% of gamers have those competent cards. Others have waited (perhaps too long) and can now go for the G80.

Don't make me pull out the Valve survey :)
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: RedStar
Yes they are on a roll:

http://www.tgdaily.com/2006/11/10/nvidia_q3_2007/

a nice surge in profits. Profits mean better R&D

That's all from G7x. Nvidia had a very efficient architecture transistor wise, and they lined their pockets with the profits. G80, judging by the die alone, will be much more expensive to manufacture.

At least ATI has been so kind as to give Nvidia a grace period to earn some nice profits with G80. Once R600 comes out, pricing should get more competitive.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: jiffylube1024


That's all from G7x. Nvidia had a very efficient architecture transistor wise, and they lined their pockets with the profits. G80, judging by the die alone, will be much more expensive to manufacture.

At least ATI has been so kind as to give Nvidia a grace period to earn some nice profits with G80. Once R600 comes out, pricing should get more competitive.

Yes, it must be very costly to make the G80 on the current 90nm process. I would imagine they are working very fast and hard to shrink to 65nm. The sooner they accomplish this shrink, the better.

Prediction: G81 (both GTS and GTX) will debut around the time R600 hits, right around the time we should see the 8300/8600s. It will be 65nm, with a 512-bit bus and a GB of GDDR4. Core speed 650MHz, memory speed minimum 2000MHz. I only say 512-bit bus because I have read that G80 was designed with a 512-bit bus in mind, just not implemented yet to keep the cost down on an already expensive 90nm-based card.
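
For what those numbers would actually mean, here's a quick back-of-envelope (my own arithmetic, and I'm assuming 2000MHz is GDDR4's effective data rate, so treat the figures as a sketch rather than specs):

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The predicted G81 figures are speculation from this thread, not official specs.
def mem_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return (bus_width_bits / 8) * effective_mt_s / 1000  # GB/s

print(mem_bandwidth_gb_s(512, 2000))  # predicted G81: 128.0 GB/s
print(mem_bandwidth_gb_s(384, 1800))  # shipping 8800 GTX (384-bit, 900MHz GDDR3): 86.4 GB/s
```

If that prediction pans out, you'd be looking at roughly a 50% jump in raw memory bandwidth over the current GTX.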

 
Jun 14, 2003
10,442
0
0
Originally posted by: dug777
Originally posted by: otispunkmeyer
Originally posted by: tuteja1986
Originally posted by: Gstanfor
you're on crack paying top dollar for the card now
No more so than those who bought GF4 Ti4600, 9700 Pro, Geforce 6800 Ultra, 7900 GTX for top dollar when they were new.

Now for an opinion I know will be viewed controversially (but do I care?)
nvidia has never required competition in order to drive them forward IMHO. G80 is proof of this. In fact, as demonstrated with nv3x and nv4x, competition hurts nvidia's ability to deliver the very best quality possible to consumers (I'll guarantee nvidia would never have introduced harmful optimizations the way they did if they were not forced to compete hard against ATi).

What are you smoking ;( ... I don't know what goes on inside your brain, but here is some advice: be more open-minded. Also, competition is the reason why G80 is so great.


Competition drives everything, but G80's been on the drawing board for a long time... so what were they designing this to compete against 4 years ago?

I don't know...maybe the equivalent ATI card at the time?

Of course competition drives the video card market, like it drives every other market, and you really do have to be smoking crack to think otherwise on this one.


Well, the living room was full of a distinctly non-tobacco smoke today. Guess Jim (housemate) got hold of some of the "plant of fun", which probably explains the wheelie bin and the for-sale sign in the front garden too.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
If the G80 launch has taught me anything about Nvidia, it's not to trust what they say they're gonna do, or how they will do it, with regard to upcoming products. All this time they were downplaying the importance of unified shaders and HDR+AA while working hard to implement exactly those features. So if Jen-Hsun says he's not working on a CPU, you can bet your @$$ he really is working on a CPU.

Also, I don't expect NV to pull any magic tricks for the R600 launch, like moving to a 512-bit bus or adding extra shaders. If the G70-G71 transition has shown me anything, it's that NV will try to maximize profit margins, even if it means giving up some of the performance/feature lead to the competition. They can probably make up the difference in marketing anyways. And face it, the R580 was and is faster than the single-GPU G71 cards; it's not as dead even as some people would like to make it seem. There were rumors and expectations that a 32-pipe monster G71 would crush the competition, but I said it wouldn't happen, it did not happen, and I'm saying the same thing will not happen this time either.

What's more interesting is that we still don't know how the G80 performs in DX10 games. Maybe it won't even matter, since the games just keep getting pushed back, but that certainly remains one aspect of the card we know nothing about.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: munky
If the G80 launch has taught me anything about Nvidia, it's not to trust what they say they're gonna do, or how they will do it, with regard to upcoming products. All this time they were downplaying the importance of unified shaders and HDR+AA while working hard to implement exactly those features. So if Jen-Hsun says he's not working on a CPU, you can bet your @$$ he really is working on a CPU.

Also, I don't expect NV to pull any magic tricks for the R600 launch, like moving to a 512-bit bus or adding extra shaders. If the G70-G71 transition has shown me anything, it's that NV will try to maximize profit margins, even if it means giving up some of the performance/feature lead to the competition. They can probably make up the difference in marketing anyways. And face it, the R580 was and is faster than the single-GPU G71 cards; it's not as dead even as some people would like to make it seem. There were rumors and expectations that a 32-pipe monster G71 would crush the competition, but I said it wouldn't happen, it did not happen, and I'm saying the same thing will not happen this time either.

What's more interesting is that we still don't know how the G80 performs in DX10 games. Maybe it won't even matter, since the games just keep getting pushed back, but that certainly remains one aspect of the card we know nothing about.


Would be very kewl to see some pure DX10 demo being run on G80.
 
Jun 14, 2003
10,442
0
0
Originally posted by: dguy6789
Originally posted by: keysplayr2003
Nvidia just turned out the nicest piece of graphics hardware ever seen. Something we all wanted, and the source of ATI/Nvidia flame wars since forever: an extremely powerful card that also offers damn near perfect image quality, AND the ability to crank the image quality without losing playability. Nvidia has done well here.

Those who can afford it and want it, buy it. Those who can't afford it, can't buy it, so they make it sound like it's stupid to buy it now and think up reasons why it's not necessary to buy one. It's actually pretty hysterical to watch all this.

7900GTX/X1950XTX are great, powerful cards, but nobody can sit there and tell me that they can play every game they own, at the highest resolution their monitor will support, with every setting maxxed, 16xAA 16xAF, and still be playable. G80 seems to offer this kind of power. No more compromising your settings.

It just appears to me that people complain no matter what the circumstances are. Usually the complaints come from those who can never afford this kind of purchase. Hey, I'm one of the folks who can't afford a G80, but damned if I can't see the merits of owning one.

So, everybody. Don't be idiots.

Indeed, the Geforce 8800 allows users to run at the settings you stated, but that won't last for long; newer and more advanced games are coming out all the time. There is something that needs to be said, though. As you all know, the 8800 has unified shaders. DX9 does not support this. We will have to wait until DX10 to see the true effect of having them all fully programmable.


Yeah, but that's the case for every GPU ever; there's always the next best piece of software or hardware around the corner.

If you worry too much about going forward you'll go backward just as quick.

There's really no time like the present to buy stuff (unless there's an impending launch, like, next week).

I could wait for more quad cores, bigger/brighter/finer displays, 65nm-fabbed G81s, bigger hard drives, bigger PSUs with better efficiency... etc., and by the time I've waited for them, there'll be 8-core CPUs and 10-bit LCDs and even faster, more powerful GPUs on the horizon.

I'll be waiting forever, and while I'm waiting my current hardware ain't getting any younger.

So yeah, if you're gonna buy, just buy it now. It's just a fact of this industry that today's top GPU probably won't be able to handle tomorrow's top-end game; you just gotta bite the bullet, get on with it, and be happy (for a while).

Those who can afford it can pump up their e-penii, and those who can't moan about it.

Still, there are finer things in life than computer hardware.

Dug, for instance, likes to spend all his money on good beer rather than his computer.
 
Jun 14, 2003
10,442
0
0
Originally posted by: keysplayr2003
Originally posted by: jiffylube1024


That's all from G7x. Nvidia had a very efficient architecture transistor wise, and they lined their pockets with the profits. G80, judging by the die alone, will be much more expensive to manufacture.

At least ATI has been so kind as to give Nvidia a grace period to earn some nice profits with G80. Once R600 comes out, pricing should get more competitive.

Yes, it must be very costly to make the G80 on the current 90nm process. I would imagine they are working very fast and hard to shrink to 65nm. The sooner they accomplish this shrink, the better.

Prediction: G81 (both GTS and GTX) will debut around the time R600 hits, right around the time we should see the 8300/8600s. It will be 65nm, with a 512-bit bus and a GB of GDDR4. Core speed 650MHz, memory speed minimum 2000MHz. I only say 512-bit bus because I have read that G80 was designed with a 512-bit bus in mind, just not implemented yet to keep the cost down on an already expensive 90nm-based card.


Same rumour I've read, but I can't remember who said it lol
 

Avalon

Diamond Member
Jul 16, 2001
7,567
156
106
Yep, Nvidia is doing very nicely as of late. Their 680i chipset kicks some serious butt, as does the G80, which is an amazing card. It also turned out to be 90nm, just like I figured. It should have lots of room for improvement if they can shrink the manufacturing process on that card. I don't think we'll see a 512-bit memory interface along with that shrink, though. I'd love to be wrong.

One thing I'm curious about is their chipsets... I've seen very few reviews of the 650i SLI/Ultra; most are of the 680i SLI. I'm curious whether the cheaper chipset can overclock as well as its big brother, since Nvidia is advertising the 680i as the overclocker of the bunch.
 
Jun 14, 2003
10,442
0
0
Originally posted by: Avalon
Yep, Nvidia is doing very nicely as of late. Their 680i chipset kicks some serious butt, as does the G80, which is an amazing card. It also turned out to be 90nm, just like I figured. It should have lots of room for improvement if they can shrink the manufacturing process on that card. I don't think we'll see a 512-bit memory interface along with that shrink, though. I'd love to be wrong.

One thing I'm curious about is their chipsets... I've seen very few reviews of the 650i SLI/Ultra; most are of the 680i SLI. I'm curious whether the cheaper chipset can overclock as well as its big brother, since Nvidia is advertising the 680i as the overclocker of the bunch.


The other thing as well: the 680i is the only one to offer two PCI-E x16 slots that actually get 16 lanes each when used in SLI, with a third x16 slot using only 8 lanes (for physics processing).

The 650i has two x16 slots, but when in SLI they run with 8 lanes each.

I can't remember the exact thing, but I'm sure I read that 8800GTX SLI actually needed two PCI-E x16 slots with 16 lanes each to work.

Can someone clear this up?
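
For a rough sense of what's at stake bandwidth-wise, here's my own back-of-envelope (assuming the usual ~250 MB/s per lane, per direction, for first-gen PCI-E after encoding overhead; these are my figures, not anything from nVidia's docs):

```python
# Approximate PCI-E 1.x throughput per slot, per direction.
# 250 MB/s per lane is an assumed gen-1 figure (after 8b/10b overhead).
PCIE1_LANE_MB_S = 250

def slot_bandwidth_gb_s(lanes):
    return lanes * PCIE1_LANE_MB_S / 1000  # GB/s per direction

print(slot_bandwidth_gb_s(16))  # full x16 slot (680i in SLI): 4.0 GB/s
print(slot_bandwidth_gb_s(8))   # x8 slot (650i in SLI): 2.0 GB/s
```

So the 650i halves each card's slot bandwidth in SLI; whether the 8800GTX actually *requires* the full x16 to run SLI is the bit I'd still like someone to confirm.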
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Wreckage
G80+680i+Kentsfield


Ahh..... One can dream. Just bought a house, so all I can do for a while is watch others have all the fun!
 

dunno99

Member
Jul 15, 2005
145
0
0
I don't get why some people keep saying that it's pointless to get the GF8800 because DX10 games aren't out yet. That's just like saying it's pointless to get a good car because you aren't (legally) allowed to drive faster than 65mph/100km/h. For one, most people aren't getting the good car just for the faster engine. They might want it for the better handling, safety, or better acceleration. Same with the GF8800, where users may want it so they don't have to deal with setting up SLI/XFire (analogous to better handling), or with having to disable it to improve performance (safety), or just to be able to run DX9c games 50% faster (better acceleration).

So for those few of you dwelling on why there's no point in getting a GF8800 because no games can use it... well, stop it. Besides, Crysis is coming out Q1 next year (not saying that everyone will play this game... but if you're the strategy type, you might want to consider Supreme Commander, which will probably need all that graphics power for dual-monitor action)... and you can either buy a GF8800 now and enjoy the benefits immediately, or you can buy it in 3 months for maybe $10 cheaper and enjoy the savings.

(I'm not going to comment on ATi's offerings, since they're not here yet, and we don't know the pricing/availability of those cards at/around launch.)
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: tuteja1986

Ohh god so now we buy these expensive GPU for :?

Replacing any single-card setup that isn't a 7900/1900: you get better IQ + performance + upgradability.

Or replacing an SLI/CF setup with better IQ + compatibility, while getting a few bucks back.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Beachboy
Nvidia is hardly the small guy but they are smacking ATI like Intel smacked AMD.

They are always trading places... like how the 9xxx series kicked all of the FX 5xxx series, then NV came back with the 6xxx, then ATI regained the crown with the X800 series, then NV's 7900, then ATI's X1900, and now G80...

I'm sure R600 will be >= G80, or else ATI/AMD will be in big trouble...
They can't lose two in a row.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: tuteja1986
Where are the games? It's a nice e-penis extension... Spend $600 to play what? ... Nvidia ain't small guys, they are big. Nvidia ain't the first company to introduce a unified architecture; that was ATI's GPU in the Xbox 360. Nvidia last year was so verbally against the unified architecture. Then suddenly the bashing of unified architecture from their head engineers stopped around Feb '06.

Nvidia talked against unified because they were throwing you a curveball and keeping everyone guessing.

I wouldn't doubt it if Nvidia themselves leaked the 30%-faster benchmarks.

I would rather spend $600 on this GPU than spend $600 on an R580 like a LOT of people did.

G80 gives gains that R580 could only have dreamed of on its launch day. Insane resolutions at insane IQ levels make R580 look mediocre at best, even in its heyday.
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
Originally posted by: otispunkmeyer
Originally posted by: tuteja1986
Where are the games? It's a nice e-penis extension... Spend $600 to play what? ... Nvidia ain't small guys, they are big.

who cares about the games?

Who cares about the games?? This man is crazy.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: keysplayr2003
Originally posted by: Wreckage
G80+680i+Kentsfield


Ahh..... One can dream. Just bought a house, so all I can do for a while is watch others have all the fun!

I know your pain. I will probably be getting a GTS+650i+E6600, which should cost a bit more than half of what a GTX+680i+X6800 costs. I also think I can wait a few more months, so hopefully I can take advantage of some price drops.
 

Conky

Lifer
May 9, 2001
10,709
0
0
These threads always make me laugh because the arguments haven't changed in years. :laugh:

The new cards are great if you are willing to part with that much cash for bragging rights. Just like buying a TNT would've been for the 32-bit color bragging rights when a Voodoo2 was still faster and no games were coded for 32-bit color yet.

Buying current videocards for anticipated future use is and always has been retarded and is a result of pure marketing.

A prudent buyer buys for what he uses his card for today. Personally, I'm maxed at 1280x1024 and am not a fan of AA, so I am able to run everything at MAX settings in every single game I play (and, more importantly, every game I want to play) with no slowdowns or issues related to speed/visuals. In other words, a new 8800GTX, as cool as it would be to have, would do me no good right now. I need to work on getting a 24" or, better yet, a 30" monitor, and THEN the new card would start to make sense for me.

My biggest fear is that the reason the new card is so fast is that Vista is such a dog that it's gonna take equipment like the 8800 to run it, and if that's the case I will stick with XP and/or Linux. All the good games eventually end up coded for Linux anyway.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Gothic 3 works just fine at 1600x1200 8xAA 16xAF on an X1900XTX 512MB / 7900GTX.
That's strange. From what I hear, Gothic 3 is more demanding than Oblivion.

NWN2 has some decent graphics when they're maxed out and the shadows are given a rescaled mipmap of 4096 or 2048. However, when that happens it brings my system almost to a halt. I've had 5-10 fps in foliage with maxed settings.

However, rather than throw uber-high-end hardware at the problem, I'd much rather see Obsidian/Atari get the kinks out of that engine and let it perform better.
Originally posted by: tuteja1986
Originally posted by: Gstanfor
you're on crack paying top dollar for the card now
No more so than those who bought GF4 Ti4600, 9700 Pro, Geforce 6800 Ultra, 7900 GTX for top dollar when they were new.

Now for an opinion I know will be viewed controversially (but do I care?)
nvidia has never required competition in order to drive them forward IMHO. G80 is proof of this. In fact, as demonstrated with nv3x and nv4x, competition hurts nvidia's ability to deliver the very best quality possible to consumers (I'll guarantee nvidia would never have introduced harmful optimizations the way they did if they were not forced to compete hard against ATi).

What are you smoking ;( ... I don't know what goes on inside your brain, but here is some advice: be more open-minded. Also, competition is the reason why G80 is so great.
QFT.
Originally posted by: Gstanfor
nvidia were first with a single main chip video card as powerful as the competition (Riva 128), first to usable 32-bit color (TNT), first to offload the graphics pipeline from the CPU (GF256), first to introduce a crossbar memory controller (GF3), first to introduce pixel & vertex shaders (GF3)... the list goes on and on and on, all the while offering performance equal to or better than anything else out there, and they are continuing to do so to this day, with nv3x being the only real stumbling block.
Ever think that they made those innovations because of competition?
If gaming is such an important part of people's lives that they're unwilling to wait a few months to see how things pan out (and you wouldn't exactly be suffering in the meantime if you had a 'last gen' ultra-high-end card or cards, judging from the benchmarks I've seen), then getting a G80 *right now* makes sense for them.
I agree, I'm just saying that other members buy it because they either want to, or they're not happy with their current rigs, or both. Heck, some do it for the improved visual quality without the performance hit. That probably sums up the enticing G80 package in one sentence: well performing, unparalleled image quality.
In my opinion you're somewhat silly to go out and grab a G80 *right now* if you've already got an extremely competent card in there, when it doesn't take a rocket scientist to see that you could save yourself a lot of money by waiting a month or so for prices to fall, but at the end of the day I'm not forcing you to agree with me, or discounting the validity of your opinion.
Yeah, I'm not going to dive into G80 yet. If I went EVGA and wanted a step-up path, I don't think the G81 is going to hit soon enough. I'll wait to see how R600 stacks up against it and whether competitive prices drag it down. I need to move to Conroe/Kentsfield too, so I'm not really in a rush for a video card.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: josh6079
I agree, I'm just saying that other members buy it because they either want to, or they're not happy with their current rigs, or both. Heck, some do it for the improved visual quality without the performance hit. That probably sums up the enticing G80 package in one sentence: well performing, unparalleled image quality.

QFT

This is exactly why people are upgrading, and this is why I want to upgrade, but I won't, 'cause my GF made me buy us a new TV on Sunday.

People recommending an X1900XT right now are crazy, unless you're going for a mid-range card.

Anyone who doesn't think the IQ and excessive horsepower are necessary... I say you're missing out on some awesome IQ.

Even in GRAW, I only manage 55fps average at 1680x1050 with settings maxed and 16x HQ AF, and GRAW doesn't even do AA.

For a high-power system with a high-power display, there is no other choice than a GTS or GTX, unless you want to find yourself in the same quandary 3 months from now.

 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: tuteja1986
Originally posted by: josh6079
...you're on crack paying top dollar for the card now especially if you've already got an x19x0xt/x/79x0/SLI/CF in there...
Why? CrossFire and SLI both cause a horde of problems depending on the game, and you'd get better performance on average without the dual-GPU problems. Not to mention it spanks an R580.

People say there are no demanding titles out, but just because there isn't a popular shooter that is making it cry doesn't mean that there isn't a game that could use its power. NWN2 and Gothic 3 could definitely benefit from a G80, and I don't know about you, but I'd like to be able to handle any game I have at the highest possible settings it offers and still have room for games that are due to come. Who wouldn't want one right now?

NWN2? You're kidding ;( The graphics in that RPG are terrible; they look 2 years old. Gothic 3 works just fine at 1600x1200 8xAA 16xAF on an X1900XTX 512MB / 7900GTX.

NWN2 is very high end...

The models aren't as complex as, say, Oblivion's, but that's because of the customization layer built into the game.

How well would G80 push a scene with 80 objects all needing pixel shading?
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Acanthus
Originally posted by: tuteja1986
Originally posted by: josh6079
...you're on crack paying top dollar for the card now especially if you've already got an x19x0xt/x/79x0/SLI/CF in there...
Why? CrossFire and SLI both cause a horde of problems depending on the game, and you'd get better performance on average without the dual-GPU problems. Not to mention it spanks an R580.

People say there are no demanding titles out, but just because there isn't a popular shooter that is making it cry doesn't mean that there isn't a game that could use its power. NWN2 and Gothic 3 could definitely benefit from a G80, and I don't know about you, but I'd like to be able to handle any game I have at the highest possible settings it offers and still have room for games that are due to come. Who wouldn't want one right now?

NWN2? You're kidding ;( The graphics in that RPG are terrible; they look 2 years old. Gothic 3 works just fine at 1600x1200 8xAA 16xAF on an X1900XTX 512MB / 7900GTX.

NWN2 is very high end...

The models aren't as complex as, say, Oblivion's, but that's because of the customization layer built into the game.

How well would G80 push a scene with 80 objects all needing pixel shading?

Probably about 5x better than anything else currently on the market.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Matt2
Originally posted by: Acanthus
Originally posted by: tuteja1986
Originally posted by: josh6079
...you're on crack paying top dollar for the card now especially if you've already got an x19x0xt/x/79x0/SLI/CF in there...
Why? CrossFire and SLI both cause a horde of problems depending on the game, and you'd get better performance on average without the dual-GPU problems. Not to mention it spanks an R580.

People say there are no demanding titles out, but just because there isn't a popular shooter that is making it cry doesn't mean that there isn't a game that could use its power. NWN2 and Gothic 3 could definitely benefit from a G80, and I don't know about you, but I'd like to be able to handle any game I have at the highest possible settings it offers and still have room for games that are due to come. Who wouldn't want one right now?

NWN2? You're kidding ;( The graphics in that RPG are terrible; they look 2 years old. Gothic 3 works just fine at 1600x1200 8xAA 16xAF on an X1900XTX 512MB / 7900GTX.

NWN2 is very high end...

The models aren't as complex as, say, Oblivion's, but that's because of the customization layer built into the game.

How well would G80 push a scene with 80 objects all needing pixel shading?

Probably about 5x better than anything else currently on the market.

Yes, but if the detail level were "Oblivion-like" you'd be seeing 5fps instead of 1fps.
 