9600GT SLI review


AzN

Banned
Nov 26, 2001
4,112
2
0
Awesome. Yeah, G84 is more similar to G92 than to G80... That means I can see some improvements on my core.

I doubt you would see that much difference with high-end cards, though. It's just that the 8400gs is so hungry for bandwidth and everything else.


Whoop, I can finally see your links now. Looks like it was down for a minute. You probably had it cached on your drive.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
They tested the 8800gs, which is just an 8800gt with a quarter of its chip disabled.

It's mostly a 2 fps increase, in extreme conditions where it's not playable.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Isn't G94 adding OpenGL 2.1 support and the PureVideo HD enhancements? It's basically G92 core hardware with feature enhancements and fewer SPs.
Nvidia is going all out on cutting costs by trimming components that aren't much needed... I think I get it now. The extra SPs are sitting idle on the G92 because, unlike the G80, it has much lower memory bandwidth. Sure, in some games they're beneficial, but not in the ones I have tested. Even Bioshock, which saw a good improvement for BFG with his 8800GTX, didn't show noticeable improvements for me (it's hard to tell; it doesn't have a canned benchmark, so I chose a few spots and sat still there, looking at whatever was on the screen and checking the fps... used save games for the same spots).

So that means that by cutting memory bandwidth (probably) in the G92, they made lots of shaders choke, shaders which could then be cut at minimal performance loss (the GTS 512 is still slightly faster than the GT, and it is slightly faster than the 9600).
I really hope the 9800GTX has a 512-bit memory bus with GDDR4 and more than 128 SPs. We need another beast.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
So what does it mean? That G92 can do the same thing as G94.
What it means is that it appears some of the performance of the 9600 GT comes from a driver boost, meaning that a re-test of the 8800 GT with newer drivers would likely cause a larger performance rift between it and a 9600 GT.

In particular there are some very big % performance gains in Crysis and it's a shame they didn't test other games like Bioshock or Call of Juarez.

It's mostly a 2 fps increase, in extreme conditions where it's not playable.
But those percentages will cause the framerate to increase more on the 8800 GT because it has a higher starting framerate.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K

So what does it mean? That G92 can do the same thing as G94.
What it means is that it appears some of the performance of the 9600 GT comes from a driver boost, meaning that a re-test of the 8800 GT with newer drivers would likely cause a larger performance rift between it and a 9600 GT.

In particular there are some very big % performance gains in Crysis and it's a shame they didn't test other games like Bioshock or Call of Juarez.

A whole 2 fps, mostly with AA.



But those percentages will cause the framerate to increase more on the 8800 GT because it has a higher starting framerate.

Higher starting frame rate, sure. Add 2 more fps on top of that "higher starting" frame rate of the 8800gt.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: taltamir
isn't G94 adding OpenGL 2.1 support and the purevideo HD enhancements? Its basically a G92 core hardware with feature enhancements and fewer SP.
nvidia is going all out about cutting costs by reducing not much needed components... I think I get it now. The extra SPs are sitting idle on the G92 because, unlike the G80, it has much lower memory bandwidth.

SPs don't sit idle. They do their part. SPs don't dictate performance, as the vital information nRollo posted on the previous page shows. G92's fillrate sits idle, however, because of memory bandwidth. As games use more shader effects, G94 will become shader bound and G92 will perform better.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Higher starting frame rate, sure. Add 2 more fps on top of that "higher starting" frame rate of the 8800gt.
Do you understand the concept of percentages?

Sure. 2 fps is still 2 fps. A full 8800gt will give you 2.5 fps more, maybe not even...
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Sure. 2 fps is still 2 fps. A full 8800gt will give you 2.5 fps more, maybe not even...
No, it isn't "still" 2 FPS; it's a percentage based off the base figure. Raw framerates are generally useless for GPU comparisons, since a 10 FPS GPU is twice as fast as a 5 FPS GPU despite "only" having 5 more frames.

Since you don't appear to understand, I'll explain it to you.
  • Expreview showed a 17.56% performance gain in Crysis @ 1280x1024 with 4xAA.
  • Tom's showed an 8800 GT gets 28.8 FPS at the same setting.
28.8 FPS raised by 17.56% is ~33.86 FPS, or ~5 FPS faster; but again, 5 FPS is largely meaningless.

What is meaningful is that the 8800 GT has gone from being 15% faster to 35% faster than a 9600 GT, thereby providing a rift more in line with what we'd expect from its relative SP count and clocks (using Tom's 9600 GT score).

Going from a 15% to a 35% performance lead is substantial.
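If it helps, here's the arithmetic spelled out as a quick Python sketch. Tom's 9600 GT score isn't quoted above, so I'm back-computing it from the original 15% lead (an assumption on my part):

base_8800gt = 28.8                     # Tom's: 8800 GT, Crysis @ 1280x1024, 4xAA
score_9600gt = base_8800gt / 1.15      # ~25.0 FPS implied by the old 15% lead (assumed)

boosted_8800gt = base_8800gt * 1.1756  # apply Expreview's 17.56% driver gain
print(round(boosted_8800gt, 2))        # 33.86 -> only ~5 FPS more in raw terms...

new_lead = boosted_8800gt / score_9600gt - 1
print(round(new_lead * 100, 1))        # 35.2 -> ...but the relative lead is now ~35%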
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: BFG10K
If the shaders on the G94 were superior in some way, it would have surpassed the 8800GT when the 8800GT ran at 64 shaders, not equaled it.
Not if your test didn't bottleneck the shaders enough.

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are bottlenecked???????? NO. Because the shaders are identical.

No matter what game was used. CoD4, Bioshock, Crysis, whatever.
This is simply untrue. As an extreme example, do you believe testing GLQuake is equivalent to testing Bioshock?

With both cards at 64 shaders, and identical clocks, does it matter what game is tested????? NO. Because the shaders are identical.

How is it required to run, say, CoD4 at 25x16 when all we are testing is power per shader?
Because you need a test situation that stresses the shaders in order to test the shaders. If you aren't stressing the shaders, how can you possibly test them?

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are stressed or not????????? NO. Because the shaders are identical.

Do you disagree that an 8800GT at 64 shaders and exact same clocks as a 9600GT would run STALKER any differently at any resolution? How about Bioshock? Lost Planet?
We don't have enough valid data to make that judgment.

Of course we have enough data. Any game, at any resolution, at any settings, will run equally on a 9600GT and a 64-shader 8800GT with the same clocks. Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia. If you run the most shader-intensive game on the planet on both of these cards at the highest possible resolution and settings, they will be equal when clocks are the same and both are at 64 shaders. /fini

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: keysplayr2003
Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia.

Ya, when they were giving us all the pre-launch presentation on this card, one of the web reviewers asked the shader question.

The response then was the one I gave you (that the shaders are the same as G80), and the guy followed up with "How does this perform so well with half the shaders?!"

(so you guys are in line with the reviewers being surprised)

The response he got was that the only change was a data compression technique that allowed the G9X cards more efficient data handling, and the shaders weren't the limiting factor.

(roughly quoted; this was a while ago, and my wife pitched my notes because I did this at my son's desk and left the notes on top of it for weeks)
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: BFG10K
Sure. 2 fps is still 2 fps. A full 8800gt will give you 2.5 fps more, maybe not even...
No, it isn't "still" 2 FPS; it's a percentage based off the base figure. Raw framerates are generally useless for GPU comparisons, since a 10 FPS GPU is twice as fast as a 5 FPS GPU despite "only" having 5 more frames.

Since you don't appear to understand, I'll explain it to you.
  • Expreview showed a 17.56% performance gain in Crysis @ 1280x1024 with 4xAA.
  • Tom's showed an 8800 GT gets 28.8 FPS at the same setting.
28.8 FPS raised by 17.56% is ~33.86 FPS, or ~5 FPS faster; but again, 5 FPS is largely meaningless.

What is meaningful is that the 8800 GT has gone from being 15% faster to 35% faster than a 9600 GT, thereby providing a rift more in line with what we'd expect from its relative SP count and clocks (using Tom's 9600 GT score).

Going from a 15% to a 35% performance lead is substantial.

17%, yet it's still 2 fps.

Now you are basing fps on 2 different websites? Way to prove a point, I suppose.

You think all fps gains are the same % even when dealing with different cards, settings, and setups? If only that were true. It just doesn't work that way. Until tests show it, the gain could be more than, equal to, or less than 2 fps.

Anyway, there is clear evidence from the benchmarks and directly from Nvidia that what I've been saying is true, all along since the GDDR5 thread and many other threads where you were quick to point out I was proven wrong.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
What? You measure percentage increase/decrease...

1 FPS doesn't sound like a lot, and it isn't on something that normally gets 100 fps.
But it is a huge improvement for something that got 1 FPS.

That is a doubling of performance instead of a 1% increase.

That means the part is probably twice as fast (at least under those conditions), and will probably be impressively faster at lower quality settings... for example, getting 40 fps instead of 20, a notable improvement.
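Here's the same point as a tiny Python snippet, using the frame rates from this post as examples:

# The same +1 FPS means very different things depending on the baseline.
def pct_gain(before, after):
    return (after - before) / before * 100

print(pct_gain(100, 101))  # 1.0   -> +1 FPS is noise at 100 fps
print(pct_gain(1, 2))      # 100.0 -> +1 FPS doubles performance at 1 fps
print(pct_gain(20, 40))    # 100.0 -> the same doubling at playable frame rates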
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: keysplayr2003
Originally posted by: BFG10K
If the shaders on the G94 were superior in some way, it would have surpassed the 8800GT when the 8800GT ran at 64 shaders, not equaled it.
Not if your test didn't bottleneck the shaders enough.

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are bottlenecked???????? NO. Because the shaders are identical.

No matter what game was used. CoD4, Bioshock, Crysis, whatever.
This is simply untrue. As an extreme example, do you believe testing GLQuake is equivalent to testing Bioshock?

With both cards at 64 shaders, and identical clocks, does it matter what game is tested????? NO. Because the shaders are identical.

How is it required to run, say, CoD4 at 25x16 when all we are testing is power per shader?
Because you need a test situation that stresses the shaders in order to test the shaders. If you aren't stressing the shaders, how can you possibly test them?

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are stressed or not????????? NO. Because the shaders are identical.

Do you disagree that an 8800GT at 64 shaders and exact same clocks as a 9600GT would run STALKER any differently at any resolution? How about Bioshock? Lost Planet?
We don't have enough valid data to make that judgment.

Of course we have enough data. Any game, at any resolution, at any settings, will run equally on a 9600GT and a 64-shader 8800GT with the same clocks. Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia. If you run the most shader-intensive game on the planet on both of these cards at the highest possible resolution and settings, they will be equal when clocks are the same and both are at 64 shaders. /fini

You have to use the 174 Forceware that shipped with the 9600GT on both cards for this to be true, as it has massive improvements and contributes to some of the 9600GT's stock performance.
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
Originally posted by: bryanW1995
Cheex, I will look down on you if you don't have dual 9800gx2's, a 9900gtx when that comes out, etc. You must always have the latest and greatest hardware or you are nothing.

It's a good thing that I know you are just joking...

I don't have that large a wallet...

I am thinking seriously about upgrading my 320 now (or as soon as I can), though.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: jaredpace
Originally posted by: keysplayr2003
Originally posted by: BFG10K
If the shaders on the G94 were superior in some way, it would have surpassed the 8800GT when the 8800GT ran at 64 shaders, not equaled it.
Not if your test didn't bottleneck the shaders enough.

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are bottlenecked???????? NO. Because the shaders are identical.

No matter what game was used. CoD4, Bioshock, Crysis, whatever.
This is simply untrue. As an extreme example, do you believe testing GLQuake is equivalent to testing Bioshock?

With both cards at 64 shaders, and identical clocks, does it matter what game is tested????? NO. Because the shaders are identical.

How is it required to run, say, CoD4 at 25x16 when all we are testing is power per shader?
Because you need a test situation that stresses the shaders in order to test the shaders. If you aren't stressing the shaders, how can you possibly test them?

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are stressed or not????????? NO. Because the shaders are identical.

Do you disagree that an 8800GT at 64 shaders and exact same clocks as a 9600GT would run STALKER any differently at any resolution? How about Bioshock? Lost Planet?
We don't have enough valid data to make that judgment.

Of course we have enough data. Any game, at any resolution, at any settings, will run equally on a 9600GT and a 64-shader 8800GT with the same clocks. Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia. If you run the most shader-intensive game on the planet on both of these cards at the highest possible resolution and settings, they will be equal when clocks are the same and both are at 64 shaders. /fini

You have to use the 174 Forceware that shipped with the 9600GT on both cards for this to be true, as it has massive improvements and contributes to some of the 9600GT's stock performance.

You missed it, Jared. A 9600GT and an 8800GT are identical when disabling 48 shaders on the 8800GT. Both have identical architecture, shaders, texture units, a 256-bit bus, and 16 ROPs. Using 174's on both cards would have the same results. I know an 8800GT should walk away from a 9600GT when more shader power is required. For some reason, BFG wants to take things further, much further, than originally intended, blowing out the scope of our test in the process. So be it. I got what I needed, and I am willing to help him find what he needs, just so long as he understands what I was looking for in the first place.

I'd just like to make clear the point a few seem to be missing. I know the 8800GT will be a performance leader in games that require more shader power. This is known to me. It doesn't escape me. Do not misinterpret my posts as thinking that I don't understand this.
My original intention was to see, on a per-shader basis, if the G94 had architecture improvements contributing to its better-than-usual performance. I now know the benchmarks were conducted using a driver that took advantage of compression technology. That was not known before all this started.

 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Do we have access to the 174 drivers? I'd like to see if my 8800GTS sees an increase. Also, does the compression assist with memory bandwidth? It would appear so.

If so, I wonder if the 8800GTS (G92) would defeat the 8800 Ultra with these new updated drivers, as AT determined that the Ultra defeated the 8800GTS due to memory bandwidth when things like AA/AF were turned on.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: keysplayr2003
Originally posted by: jaredpace
Originally posted by: keysplayr2003
Originally posted by: BFG10K
If the shaders on the G94 were superior in some way, it would have surpassed the 8800GT when the 8800GT ran at 64 shaders, not equaled it.
Not if your test didn't bottleneck the shaders enough.

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are bottlenecked???????? NO. Because the shaders are identical.

No matter what game was used. CoD4, Bioshock, Crysis, whatever.
This is simply untrue. As an extreme example, do you believe testing GLQuake is equivalent to testing Bioshock?

With both cards at 64 shaders, and identical clocks, does it matter what game is tested????? NO. Because the shaders are identical.

How is it required to run, say, CoD4 at 25x16 when all we are testing is power per shader?
Because you need a test situation that stresses the shaders in order to test the shaders. If you aren't stressing the shaders, how can you possibly test them?

With both cards at 64 shaders, and identical clocks, does it matter if the shaders are stressed or not????????? NO. Because the shaders are identical.

Do you disagree that an 8800GT at 64 shaders and exact same clocks as a 9600GT would run STALKER any differently at any resolution? How about Bioshock? Lost Planet?
We don't have enough valid data to make that judgment.

Of course we have enough data. Any game, at any resolution, at any settings, will run equally on a 9600GT and a 64-shader 8800GT with the same clocks. Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia. If you run the most shader-intensive game on the planet on both of these cards at the highest possible resolution and settings, they will be equal when clocks are the same and both are at 64 shaders. /fini

You have to use the 174 Forceware that shipped with the 9600GT on both cards for this to be true, as it has massive improvements and contributes to some of the 9600GT's stock performance.

You missed it, Jared. A 9600GT and an 8800GT are identical when disabling 48 shaders on the 8800GT. Both have identical architecture, shaders, texture units, a 256-bit bus, and 16 ROPs. Using 174's on both cards would have the same results. I know an 8800GT should walk away from a 9600GT when more shader power is required. For some reason, BFG wants to take things further, much further, than originally intended, blowing out the scope of our test in the process. So be it. I got what I needed, and I am willing to help him find what he needs, just so long as he understands what I was looking for in the first place.

I'd just like to make clear the point a few seem to be missing. I know the 8800GT will be a performance leader in games that require more shader power. This is known to me. It doesn't escape me. Do not misinterpret my posts as thinking that I don't understand this.
My original intention was to see, on a per-shader basis, if the G94 had architecture improvements contributing to its better-than-usual performance. I now know the benchmarks were conducted using a driver that took advantage of compression technology. That was not known before all this started.

Yah, I'm agreeing with you, Keys. Just saying that if you used 169.25 or whatever Nvidia is recommending as latest for the 8800gt and pit it against a 9600gt with the 174.16 WHQL, even when SPs, ROPs, etc. are equal, the 64sp 9600 will still beat the 64sp 8800 because of the drivers.

You have to run both cards on the 174 Forceware at 64sp each; then it will be the same. The drivers that come on the 9600gt disks are 174, and provide a huge performance increase. You need to get around Nvidia by using a modded inf file to make them recognize the 8800gt. When you do this, and change the SPs to 64, yes, they are exactly the same card, as you said (minus being named g94 and having the 27mhz clock crystal).

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: ArchAngel777
Do we have access to the 174 drivers? I'd like to see if my 8800GTS sees an increase. Also, does the compression assist with memory bandwidth? It would appear so.

If so, I wonder if the 8800GTS (G92) would defeat the 8800 Ultra with these new updated drivers, as AT determined that the Ultra defeated the 8800GTS due to memory bandwidth when things like AA/AF were turned on.

I've already posted the drivers in the other thread.

http://www.laptopvideo2go.com/...index.php?showforum=89

1. Download the driver exe and the modded nv4disp.inf file.

2. Extract the contents of the driver exe to a folder.

3. Copy the modded inf file into the driver folder.

4. Run setup.exe from that folder.
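If you'd rather script steps 3 and 4, here's a hypothetical Python sketch; the paths are examples only, and it assumes you've already downloaded the modded inf and extracted the driver exe:

import os, shutil, subprocess

driver_dir = r"C:\NVIDIA\174.16"          # example: folder with the extracted driver files
modded_inf = r"C:\Downloads\nv4disp.inf"  # example: the modded inf from laptopvideo2go

# Step 3: overwrite the stock inf so setup will recognize GeForce 8 cards
shutil.copy(modded_inf, driver_dir)

# Step 4: run the installer from the modified folder
subprocess.run(os.path.join(driver_dir, "setup.exe"), check=True)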
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: BFG10K
But those percentages will cause the framerate to increase more on the 8800 GT because it has a higher starting framerate.

Bingo. I noticed that, pages after this, people still do not understand what a "percentage" is.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: jaredpace
Originally posted by: ArchAngel777
Do we have access to the 174 drivers? I'd like to see if my 8800GTS sees an increase. Also, does the compression assist with memory bandwidth? It would appear so.

If so, I wonder if the 8800GTS (G92) would defeat the 8800 Ultra with these new updated drivers, as AT determined that the Ultra defeated the 8800GTS due to memory bandwidth when things like AA/AF were turned on.

I've already posted the drivers in the other thread.

http://www.laptopvideo2go.com/...index.php?showforum=89

1. Download the driver exe and the modded nv4disp.inf file.

2. Extract the contents of the driver exe to a folder.

3. Copy the modded inf file into the driver folder.

4. Run setup.exe from that folder.

Thanks! From your PM I thought that you were referring to the mobile graphics driver. Now I know what you meant. I'll give it a shot when I get home.

 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yah, I think they just rewrite the inf with one from the 169 series to include the full GeForce 8/9/mobile product line-up. The original 174 inf only includes variations of the 9 series.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
With both cards at 64 shaders, and identical clocks, does it matter if the shaders are bottlenecked???????? NO.
Of course it does!

I have two identical cars, except one has 10 pistons while the other has 5, and I want to find out if the pistons are equally powerful in each car.

So I disable 5 pistons in one car to get 5 active pistons in each.

Then I get each car to pull a feather. That's right, a feather.

Since both cars pull the feather equally well, can I conclude the pistons are identical from that test?

Of course not, because the feather doesn't stress the pistons, much like I can't conclude the shaders are identical from your test, because your test doesn't stress the shaders.

So to reword your original question: with both cars at 5 pistons and identical otherwise, does it matter if the cars are only required to pull feathers????????

Because the shaders are identical.
But we can't infer that from your tests, much like how feathers will tell us nothing about the ability of car pistons to generate pulling power.

Just as we suspected about 10 pages ago, and now confirmed by Rollo and Nvidia. If you run the most shader-intensive game on the planet on both of these cards at the highest possible resolution and settings, they will be equal when clocks are the same and both are at 64 shaders.
We know that now because nVidia told us, not because of your test.

It's like the car manufacturer stepping in and telling us the pistons are identical in both cars, but we sure as hell couldn't infer that from our feather tests.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
You think all fps gains are the same % even when dealing with different cards, settings, and setups?
Except we aren't dealing with different cards or different settings, so you clearly didn't read or comprehend the example I provided you.

If only that were true. It just doesn't work that way. Until tests show it, the gain could be more than, equal to, or less than 2 fps.
I see little point in continuing this tangent with you until you gain a sufficient understanding of percentages.

Anyway, there is clear evidence from the benchmarks and directly from Nvidia that what I've been saying is true, all along since the GDDR5 thread and many other threads where you were quick to point out I was proven wrong.
Not quite. We have benchmarks where I demonstrated a 1:2 ratio from increased shader power that carried over to SP increases, which meant TMUs were a minor factor.

That, and we know the 17x.xx drivers provide gains for the 9600 GT, and now we also know the 9600 GT secretly overclocks itself in certain situations, again narrowing the rift between it and the 8800 GT (no response from nVidia on that issue, I might add).
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
OK, when I came home this evening I sat down and benched my system with the 169.21 driver set. I tested 3 games and recorded the exact settings... Then I upgraded to 174.16 and ran the same tests... The difference? Sadly, there was no measurable difference: all tests ran within 1% of each other.

The settings I used were:

1680x1050 - 4X/16X/TSAA/HQ forced in the CP to override any in-game settings, Vsync DISABLED. For those wondering, I was only able to test the following:

F.E.A.R.
3DMark06
HL2: Lost Coast

There was no gain at all with the settings that I used.

I was hoping it would improve my score at the settings I use, but it didn't. I didn't test any other resolution or non-AA settings, because I don't play at those settings.
 