Anandtech Reviews R600!

Page 2

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Extelleron
Originally posted by: spittledip
They need to drop that price to make it competitive... Nvidia could raise the price on their GTS's and still get people to buy at this point.

Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there are no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

The two things that I think are causing the HD 2900XT to not perform as well as it should are a) only 16 TMU's, and b) immature drivers (especially with AA performance). The majority of games today are still very texture dependent and the 8800GTX with 32 TMU's has a huge advantage over the HD 2900XT's 16. This is one of the reasons, along with drivers, the 2900XT doesn't see a huge advantage over the X1950XTX in some games - it only has slightly more texture power from the higher clockspeed (750MHz vs 650MHz). "Next-gen" games like Company of Heroes and those based on the Unreal 3 engine are much more shader dependent, and here the HD 2900XT shines.
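As a back-of-the-envelope check of that TMU argument, the texture throughput gap can be sketched as simple arithmetic (assuming peak bilinear fillrate ≈ TMUs × core clock, using the clocks quoted in the post and the 8800 GTX's 575 MHz stock core clock):

```python
# Back-of-the-envelope texture fillrate: TMUs * core clock.
# Clock figures are the ones quoted in the post (750 vs 650 MHz);
# the 8800 GTX core clock (575 MHz) is its stock spec.
cards = {
    "HD 2900 XT": (16, 750),
    "X1950 XTX":  (16, 650),
    "8800 GTX":   (32, 575),
}
for name, (tmus, core_mhz) in cards.items():
    gtexels = tmus * core_mhz / 1000  # GTexels/s
    print(f"{name}: {gtexels:.1f} GTexels/s peak bilinear")
```

On these assumptions the XT has only about 15% more texture throughput than the X1950XTX (12.0 vs 10.4 GTexels/s), while the GTX sits roughly 50% above the XT at 18.4, which is the shape of the argument above.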

The 2900 takes a major beating when AA is applied. So since it loses some benchmarks to the 8800, plus the power/heat/noise issues, the GTS is a better card even at the same price, or a little higher for that matter.
 

MadBoris

Member
Jul 20, 2006
129
0
0
Originally posted by: Extelleron
Originally posted by: Tridus
Originally posted by: Extelleron
Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there are no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

Right now it's $70 less for similar performance. That alone is reason enough.

Power is also an issue. The 2900XT draws insane amounts of power for the performance, far more than the 8800GTS. Power costs money, both in the power itself and in a PSU that can feed it (not a problem if you bought a 600W PSU, but lots of us don't have those). Plus, all that extra power gets converted into extra heat.

There isn't much to recommend the 2900XT against a cheaper 8800GTS.

The most advanced games featured in the test suites (CoH as one of them, Rainbow Six another which is based on the UE3 engine) run very well on the HD 2900XT. In CoH the 2900XT is significantly faster than the GTS, and with 2048x1536 4x/8x settings in Vista beats the GTX. In Rainbow Six the 2900XT is faster than an overclocked GTX. (In one review at least, Anandtech shows the GTX and XT with equal performance. Perhaps legitreviews used a newer driver)

CoH is the 2900XT's premiere game (benchmark-wise). What about all the other game benchmarks where it falls flat on its face, including being beaten by the GTS 320?

You're being a bit one-sided where the choice on performance is not so clear. Other reviews show there is a large discrepancy in benchmark performance under different games and settings, so trying to say that the 2900XT supersedes the 8800GTS is just plain not true.
It does for CoH, as your example shows, so if that is your game then great.
But what if it's Stalker, Battlefield 2 or Oblivion at 4xAA, where the GTS wins, even the 320 most of the time? I guess you have to call a spade a spade.
 

jaykishankrk

Senior member
Dec 11, 2006
204
0
71
How did they bench a CrossFire setup when it is clearly mentioned that they used only an Intel setup in their test bed? We need to know the AMD setup for the test bed, so that the figures indicated are true. Some may argue that the test setup has no bearing on game performance, but I disagree with that notion.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Wreckage
Originally posted by: Extelleron
Originally posted by: spittledip
They need to drop that price to make it competitive... Nvidia could raise the price on their GTS's and still get people to buy at this point.

Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there are no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

The two things that I think are causing the HD 2900XT to not perform as well as it should are a) only 16 TMU's, and b) immature drivers (especially with AA performance). The majority of games today are still very texture dependent and the 8800GTX with 32 TMU's has a huge advantage over the HD 2900XT's 16. This is one of the reasons, along with drivers, the 2900XT doesn't see a huge advantage over the X1950XTX in some games - it only has slightly more texture power from the higher clockspeed (750MHz vs 650MHz). "Next-gen" games like Company of Heroes and those based on the Unreal 3 engine are much more shader dependent, and here the HD 2900XT shines.

The 2900 takes a major beating when AA is applied. So since it loses some benchmarks to the 8800, plus the power/heat/noise issues, the GTS is a better card even at the same price, or a little higher for that matter.

Mainly because of drivers. There are a lot of weird, weird problems with the XT on these current drivers. Inquirer's test shows higher performance at 2560x1600 than 1920x1200 in several tests; at 2560x1600 with AA/AF the XT really encroaches on the oc'd GTX (626/2000) and the overclocked XT beats it several times. In FEAR at 2560x1600 16xAA/16xAF there's no competition; the XT crushes the oc'd GTX (which is as fast as the Ultra).

 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: Extelleron
The most advanced games featured in the test suites (CoH as one of them, Rainbow Six another which is based on the UE3 engine) run very well on the HD 2900XT. In CoH the 2900XT is significantly faster than the GTS, and with 2048x1536 4x/8x settings in Vista beats the GTX. In Rainbow Six the 2900XT is faster than an overclocked GTX. (In one review at least, Anandtech shows the GTX and XT with equal performance. Perhaps legitreviews used a newer driver)

So in other words, you want us to buy the 2900XT for the crappy console port that is R6V because nothing will change and all UE3 games will favor it?
 

Zenoth

Diamond Member
Jan 29, 2005
5,196
197
106
Originally posted by: Extelleron
Originally posted by: spittledip
They need to drop that price to make it competitive... Nvidia could raise the price on their GTS's and still get people to buy at this point.

Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there are no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

The two things that I think are causing the HD 2900XT to not perform as well as it should are a) only 16 TMU's, and b) immature drivers (especially with AA performance). The majority of games today are still very texture dependent and the 8800GTX with 32 TMU's has a huge advantage over the HD 2900XT's 16. This is one of the reasons, along with drivers, the 2900XT doesn't see a huge advantage over the X1950XTX in some games - it only has slightly more texture power from the higher clockspeed (750MHz vs 650MHz). "Next-gen" games like Company of Heroes and those based on the Unreal 3 engine are much more shader dependent, and here the HD 2900XT shines.

R600 has inferior anisotropic filtering, and since that is hardware-based, it won't change through driver updates.

R600 has worse power-consumption efficiency for its given performance than the GTS and GTX.

R600 is still, at the moment, more expensive than a GTS 640MB in some places, and it is now that is important. Being less expensive "soon" isn't relevant; when you launch a product you want people to be interested in it when it gets shown, not a month or two later.

Let me tell you this: I would have been the first to write something against G80. I like ATi, I have bought six cards from them in total over the past four years, but this time I see no real advantages to the HD 2900XT, to borrow your words.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Aikouka
Originally posted by: Extelleron
The most advanced games featured in the test suites (CoH as one of them, Rainbow Six another which is based on the UE3 engine) run very well on the HD 2900XT. In CoH the 2900XT is significantly faster than the GTS, and with 2048x1536 4x/8x settings in Vista beats the GTX. In Rainbow Six the 2900XT is faster than an overclocked GTX. (In one review at least, Anandtech shows the GTX and XT with equal performance. Perhaps legitreviews used a newer driver)

So in other words, you want us to buy the 2900XT for the crappy console port that is R6V because nothing will change and all UE3 games will favor it?

Find me another game on the UE3 engine that we can benchmark? Chances are if a card is strong at one game, it will be strong at another game using the same engine. nVidia used to destroy ATI in Doom 3, Quake 4, etc... because they all used the same engine and ATI sucked at it (well, at OpenGL in general). I certainly hope UT2007 is better implemented than Rainbow Six, especially with regards to use of HDR + AA.
 

Nnyan

Senior member
May 30, 2003
239
1
76
After AT and HO, I liked the summary from TechReport:

http://www.techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=16

"Ultimately, though, we can't overlook the fact that AMD built a GPU with 700M transistors that has 320 stream processor ALUs and a 512-bit memory interface, yet it just matches or slightly exceeds the real-world performance of the GeForce 8800 GTS. The GTS is an Nvidia G80 with 25% of its shader core disabled and only 60% of the memory bandwidth of the Radeon HD 2900 XT. That's gotta be a little embarrassing. At the same time, the Radeon HD 2900 XT draws quite a bit more power under load than the full-on GeForce 8800 GTX, and it needs a relatively noisy cooler to keep it in check. If you ask folks at AMD why they didn't aim for the performance crown with a faster version of the R600, they won't say it outright, but they will hint that leakage with this GPU on TSMC's 80HS fab process was a problem. All of the telltale signs are certainly there."
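The bandwidth figure in that quote checks out arithmetically (assuming the usual formula bandwidth = bus width in bytes × effective data rate, with the stock memory clocks of the era: 1650 MT/s effective for the XT's GDDR3, 1600 MT/s for the GTS):

```python
# Memory bandwidth = (bus width / 8 bytes) * effective transfer rate.
# Memory clocks assumed: HD 2900 XT GDDR3 at 1650 MT/s effective,
# 8800 GTS at 1600 MT/s effective (stock specs of the era).
def bandwidth_gbs(bus_bits, effective_mts):
    return bus_bits / 8 * effective_mts / 1000  # GB/s

xt = bandwidth_gbs(512, 1650)   # ~105.6 GB/s
gts = bandwidth_gbs(320, 1600)  # ~64.0 GB/s
print(f"HD 2900 XT: {xt:.1f} GB/s, 8800 GTS: {gts:.1f} GB/s "
      f"({gts / xt:.0%} of the XT's bandwidth)")
```

64.0 / 105.6 is almost exactly the "60% of the memory bandwidth" TechReport cites.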
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Zenoth
Originally posted by: Extelleron
Originally posted by: spittledip
They need to drop that price to make it competitive... Nvidia could raise the price on their GTS's and still get people to buy at this point.

Why would you buy an 8800GTS when you could get an HD 2900XT? Besides consuming less power, there are no real advantages to the GTS. If the GTS is $70 less (which it is now but will not be soon) then I could possibly see why, but at the same price the 2900XT is a better card. If you compared nVidia's cards at release to the 2900XT right now, things would be a different story. All ATI needs is to improve drivers and things will change.

Even right now, the 2900XT is ahead of even an overclocked GTX in Rainbow Six Vegas, which is based on the Unreal 3 engine. UT 2007 and numerous other games are based on that engine, and it is seen as one of the next-generation gaming engines (along with CryTek 2). If the 2900XT shines here, it will likely shine in those games.

The two things that I think are causing the HD 2900XT to not perform as well as it should are a) only 16 TMU's, and b) immature drivers (especially with AA performance). The majority of games today are still very texture dependent and the 8800GTX with 32 TMU's has a huge advantage over the HD 2900XT's 16. This is one of the reasons, along with drivers, the 2900XT doesn't see a huge advantage over the X1950XTX in some games - it only has slightly more texture power from the higher clockspeed (750MHz vs 650MHz). "Next-gen" games like Company of Heroes and those based on the Unreal 3 engine are much more shader dependent, and here the HD 2900XT shines.

R600 has inferior anisotropic filtering, and since that is hardware-based, it won't change through driver updates.

R600 has worse power-consumption efficiency for its given performance than the GTS and GTX.

R600 is still, at the moment, more expensive than a GTS 640MB in some places, and it is now that is important. Being less expensive "soon" isn't relevant; when you launch a product you want people to be interested in it when it gets shown, not a month or two later.

Let me tell you this: I would have been the first to write something against G80. I like ATi, I have bought six cards from them in total over the past four years, but this time I see no real advantages to the HD 2900XT, to borrow your words.

If you stare at a screenshot all day long, you might see a difference between the 8xxx series and the HD 2xxx series IQ-wise. I don't know about you, but I don't buy a card to stare at still screenshots looking for subtle and otherwise unnoticeable differences. The only difference I could really see was in BF2142.

At launch, the GTS was not $320 either, it was $450 or perhaps higher. The fact that you can get an XT today for $410 ($10 more than MSRP) is pretty good. The prices will drop.


 

bigsnyder

Golden Member
Nov 4, 2004
1,568
2
81
I think the folks holding out for a miracle from the drivers are going to be disappointed. AMD/ATI has had an extra six months to work on them. If this beast was released "on time", then there would be an argument for the issue.

C Snyder
 

IeraseU

Senior member
Aug 25, 2004
778
0
71
I have to say that I really like the HDMI output with integrated audio. This will be super for HTPC use. The performance is in line with an 8800 GTS, but the power consumption and price make it not quite as solid a buy for strictly gaming. To me it's still a good buy because I am as interested in HTPC use, and the HDMI addition and free game bundle end up making the package relatively attractive, more so with the no doubt forthcoming price cuts.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: bigsnyder
I think the folks holding out for a miracle from the drivers are going to be disappointed. AMD/ATI has had an extra six months to work on them. If this beast was released "on time", then there would be an argument for the issue.

C Snyder

Recent driver releases have seen performance improvements of 11-42%, and there are clearly (if you've read any of the reviews) major issues with the HD 2900XT that are driver problems (such as the 2900XT performing worse than an X1950XTX in one or two scenarios). There is a well-known driver bug that is inhibiting AA performance. When this is fixed we will see the power of the 512-bit bus.

Overall I'm not too happy with the current performance of the 2900XT, but I'm happy to see a lot of it looks like a driver problem and not a hardware problem. I think the bottom line is that nVidia really came out with a better solution than anyone, including ATI, thought they would. From original rumors, G80 wasn't even going to have unified shaders, and I think ATI was expecting performance to be less than what it ended up being. When G80 came and blew everything else away, ATI had to push back the launch if they were going to launch a competitive product, and scrap the 1GB GDDR4 XTX as well due to low performance.

I'm really excited, however, at what the HD 2900 looks to be capable of. In next-gen games like CoH and UE3 based games, the 2900 has great performance. Hopefully this is a sign of good things to come.
 

MadBoris

Member
Jul 20, 2006
129
0
0
Originally posted by: bigsnyder
I think the folks holding out for a miracle from the drivers are going to be disappointed. AMD/ATI has had an extra six months to work on them. If this beast was released "on time", then there would be an argument for the issue.

C Snyder

I'm starting to think some of the great specs are being crippled by some internal bottleneck. A 512-bit memory bus and 320 shaders are great and all, but... something is holding this thing back, and it isn't just drivers.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: jaykishankrk
How did they bench a CrossFire setup when it is clearly mentioned that they used only an Intel setup in their test bed? We need to know the AMD setup for the test bed, so that the figures indicated are true. Some may argue that the test setup has no bearing on game performance, but I disagree with that notion.

Crossfire is supported on most Intel platforms. Actually, AMD claims it works on any dual x16 slot PCI-E platform.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: PingSpike
Seems pretty good. Edges out the 8800GTS (except in Stalker), enough for it to be competitive at that price point. But with some downward price pressure on the 8800GTS 640MB, and the drawbacks on the R600 of a louder cooler and high power consumption, it's kind of a wash to me.

ATI had a hefty performance advantage and vastly superior IQ last generation to make up for its crummy coolers and weak selection of board partners. Not so this time around. With how much these cards cost, I feel like the 8800gts is still a better choice.

I agree, not to mention that for about $120 more I can get a GTX, and that still uses less power than an X2900.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
http://firingsquad.com/hardware/amd_ati...2900_xt_performance_preview/page19.asp

QUOTE:
Under the new build of the game, the factory overclocked GeForce 8800 GTS card is able to squeeze ahead of the Radeon HD 2900 XT under both 0xAA and 4xAA. In fact, in both cases the GeForce 8800 GTS card is able to pull away from the Radeon HD 2900 XT as resolution is increased. This is surprising considering the Radeon card's memory bandwidth advantage.

We should hopefully see more DX10 content very soon. The DX10 patch for Company of Heroes should be released later this month, while a DX10 version of the hit Xbox 360 shooter Lost Planet is expected to be released later this summer.

DX10 benches WITH AA aren't looking too promising for ATI either...
Well, it might beat the GTS slightly, but it is nowhere near the "hidden DX10 killer" others are making it out to be...
 

MadBoris

Member
Jul 20, 2006
129
0
0
Originally posted by: Nightmare225
We should hopefully see more DX10 content very soon. The DX10 patch for Company of Heroes should be released later this month, while a DX10 version of the hit Xbox 360 shooter Lost Planet is expected to be released later this summer.

Company of Heroes is an ATI star game. You can say Relic geared the game's rendering engine very well to utilize ATI hardware, or maybe you can say they did poorly at utilizing Nvidia hardware. The X1950XTX even beats out the 8800GTS in CoH (yet the GTS is definitely superior to the 1950 overall).

I would expect the DX10 patch for CoH to be favorable to ATI as well. I would be very surprised if CoH wasn't ATI's DX10 hero.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: MadBoris
Originally posted by: Nightmare225
We should hopefully see more DX10 content very soon. The DX10 patch for Company of Heroes should be released later this month, while a DX10 version of the hit Xbox 360 shooter Lost Planet is expected to be released later this summer.

Company of Heroes is an ATI star game. You can say Relic geared the game's rendering engine very well to utilize ATI hardware, or maybe you can say they did poorly at utilizing Nvidia hardware. The X1950XTX even beats out the 8800GTS in CoH (yet the GTS is definitely superior to the 1950 overall).

I would expect the DX10 patch for CoH to be favorable to ATI as well. I would be very surprised if CoH wasn't ATI's DX10 hero.
The X1950XTX beats the 8800GTS in several tests. When pushed to the limit at high-resolutions with AA/AF, the X1950XTX is faster in several games (CoH as you say is one of them, FEAR is another).

 

Regs

Lifer
Aug 9, 2002
16,665
21
81
The whole review I got the impression that Derek was trying to say "Well at least it's not a complete flop".

Though brand loyalty is much stronger for a GPU part than it is for a CPU part. You can label brand loyalty "fanboi" all you want but people are going to buy what they most feel comfortable with.

If they want to ignore the signals that this card shows obvious drawbacks and needs a revision to show its worth for the near future, then I almost feel sorry for them for wasting 400 dollars on games that won't be out until later this year. DX10, I agree with many others, is for the 65nm parts coming from both Nvidia and hopefully ATi later this year.

However, if you want better performance today coming from an X800 or X1900, and continue to be loyal to ATi, then I see no reason not to buy the X2900. Just don't ask me to recommend it to you.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Regs
The whole review I got the impression that Derek was trying to say "Well at least it's not a complete flop".

Though brand loyalty is much stronger for a GPU part than it is for a CPU part. You can label brand loyalty "fanboi" all you want but people are going to buy what they most feel comfortable with.

If they want to ignore the signals that this card shows obvious drawbacks and needs a revision to show its worth for the near future, then I almost feel sorry for them for wasting 400 dollars on games that won't be out until later this year. DX10, I agree with many others, is for the 65nm parts coming from both Nvidia and hopefully ATi later this year.

However, if you want better performance today coming from an X800 or X1900, and continue to be loyal to ATi, then I see no reason not to buy the X2900. Just don't ask me to recommend it to you.

It really depends on the drivers for me. If drivers can improve performance by at least 15-20%, which I think they will be able to, then I think the HD 2900XT is an excellent value at $400. However, if things really don't change, then for the most part the 2900XT is a decent deal. If you intend to keep your card for a year or more, however, I would recommend the 2900XT over the 8800GTS.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: MadBoris

I'm starting to think some of the great specs are being crippled by some internal bottleneck. A 512-bit memory bus and 320 shaders are great and all, but... something is holding this thing back, and it isn't just drivers.
is it a *feeling* ?


 

Regs

Lifer
Aug 9, 2002
16,665
21
81
Originally posted by: apoppin
Originally posted by: MadBoris

I'm starting to think some of the great specs are being crippled by some internal bottleneck. A 512-bit memory bus and 320 shaders are great and all, but... something is holding this thing back, and it isn't just drivers.
is it a *feeling* ?


Part of the bottleneck was explained by Derek in the conclusion:


And here's what AMD did wrong:

First, they refuse to call a spade a spade: this part was absolutely delayed, and it works better to admit this rather than making excuses. Forcing MSAA resolve to run on the shader hardware is less than desirable and degrades both pixel throughput and shader horsepower as opposed to implementing dedicated resolve hardware in the render back ends. Not being able to follow through with high end hardware will hurt in more than just in lost margins. The thirst for wattage that the R600 displays is not what we'd like to see from an architecture that is supposed to be about efficiency. Finally, attempting to extract a high instruction level parallelism using a VLIW design when something much simpler could exploit the huge amount of thread level parallelism inherent in graphics was not the right move.

Assuming what Derek said was accurate, it's a clear case of ATi having too many "good ideas" at once. A design too complicated and ahead of its time. The card could have spent another year or two in development.
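For anyone wondering what "MSAA resolve on the shader hardware" actually means: normally the render back-end averages a pixel's subsamples in fixed-function hardware, but R600 runs that averaging as shader code, so it competes with the game's own shader work. A minimal conceptual sketch in Python (not actual GPU code):

```python
# Conceptual sketch of a box-filter MSAA resolve: each output pixel is
# the average of its N color subsamples. On R600 this averaging runs on
# the shader ALUs instead of dedicated render back-end hardware, which
# costs shader cycles every frame AA is enabled.
def resolve_pixel(subsamples):
    """Average N RGB subsamples into one resolved color."""
    n = len(subsamples)
    return tuple(sum(channel) / n for channel in zip(*subsamples))

# A 4x MSAA pixel on a red/black triangle edge: half the samples covered
edge_pixel = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0),
              (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_pixel(edge_pixel))  # -> (0.5, 0.0, 0.0)
```

The extra cost is one averaging pass over every subsample of every pixel, which lines up with the big performance drops the reviews show once AA is enabled.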
 