*OFFICIAL* ATI Radeon X800 PRO & XT Review Thread


PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
how ironic. if i read the reviews correctly, Nvidia has the slightly better hardware, ATI the better fit and finish.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Shad0hawK
Originally posted by: GTaudiophile
BTW, after watching that Ruby demo movie...

Nalu's flipper is toast!


i just watched it, it did not impress me much after seeing individual strands of hair move in water (plus the ATI demo was very cheesy)

"we will meet again my dear ruby!"
*overly dramatic music plays*

ROFL!!!

i guess i will be playing farcry, stalker, battle for middle earth...etc. in PS3 with superior framerates and image quality while the ATI fanboys console themselves with the fact they only have to use one molex connector...

i really hate to burst your bubble, but FarCry ONLY uses PS3 for H2O effects . . . by the time the GF makes a diff in games, you will have upgraded to NV45/50 or r500.

That said, i am leaning toward buying a 6800u-E; it's really close, but i think nVidia will ultimately edge out the xt as their drivers mature and if their retail cores o/c.

That said, i would not mind getting the x800xt-PE . . . ultimately it's gonna depend on price/availability and features and SOFTware included in the retail package (no GD coupons for HL-II).


After all, either one would go well in my case . . .
(and i DO have 480w of VERY quiet/stable TT power)

:roll:
 

GTaudiophile

Lifer
Oct 24, 2000
29,776
31
81
Let's not forget the X800 chips are made with Low-K while the 6800 chips are not. I have the feeling the X800 chips will scale better, much better than nVidia would like to admit.

And it's funny to me, after seeing how much trouble nVidia had with .13mu, how ATi has taken off with it, using more advanced technology.
 

Dulanic

Diamond Member
Oct 27, 2000
9,949
569
136
I see this 2 days..... ATi and Nvidia are VERY close on the top cards... however how many people buy the top cards? I think overall the GT will be the biggest hit; it has the best value, and it is being reported it OCs to beyond the performance of the Ultra Extreme! So overall I think the GT will be the most popular card out there... and it is also VERY nice to see Nvidia is taking the few issues they did have very seriously, like in Far Cry. If I was looking for my next card it would probably be the GT: single slot, one molex, OCs like crazy, and isn't as expensive as the top cards.

Our driver team has been busy with stability, bugs and performance tweaks as we get closer to the ship dates for our OEM partners and add-in card partners. Areas of interest compared to 60.72 driver include:

5% boost in Aquamark
5% boost in X2 performance
5% boost in Tomb Raider: Angel of Darkness performance
10% boost to Call of Duty performance
50% improvement in Far Cry performance
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: GTaudiophile
Let's not forget the X800 chips are made with Low-K while the 6800 chips are not. I have the feeling the X800 chips will scale better, much better than nVidia would like to admit.

And it's funny to me, after seeing how much trouble nVidia had with .13mu, how ATi has taken off with it, using more advanced technology.
First of all, it ain't over . . .
Nvidia confirms ATI Platinum counterattack
[nVidia] would make a response to ATI's "Platinum" X800 XT.
According to the site, Neuschäfer said Nvidia's partners will create cards to counter the ATI strike, and so it looks like the graphics wars are getting ever more intense.

regarding the .13 process:


Only 10,000 GeForce 6800 series to be produced? IBM Microelectronics yields could be the problem
Various suggestions have been put forward by exceptionally well informed individuals, but more than one has basically suggested that 'yields suck at IBM' and that 'maybe other (IBM) customers are having the same issues'. We've reported on yield problems at IBM in previous weeks.

Following the launch of the ill-fated GeForce FX 5800 Ultra (NV30) we seem to recall some public comments by Nvidia, and definitely from one of its partners, which referred to process problems at TSMC and essentially seemed to lay significant blame at TSMC's feet for the failure of NV30.

Subsequently when Nvidia started to canoodle with IBM, we heard lots of loving noises from Nvidia PR regarding IBM. TSMC might not be too happy with Nvidia - certainly ATI is now one of its very valued customers.

Of interest re:
How ATI double whammied Nvidia over GDDR3
NVIDIA'S mid-range GeForce FX 5700 Ultra (NV36) has had a bit of a rocky history. It was first introduced with GDDR2 video memory but as stocks of GDDR2 dwindled, it seems Nvidia was more or less forced to re-equip 5700 Ultra with the new, initially more expensive, GDDR3.
The background to this seems to be that Samsung had already been burnt somewhat over its GDDR2 business with Nvidia. This was because when Nvidia's first GDDR2 based product, GeForce FX 5800 Ultra (NV30), failed to be a commercial success, Samsung was stuck with the GDDR2 it had specifically created.

This was an opportunity that ATI rapidly capitalised upon, and it subsequently bought a significant amount of GDDR2 from Samsung for its 256MB Radeon 9800 PRO. Apparently the decision was a bit of a no-brainer, as it seems the price paid by ATI for these GDDR2 modules wasn't dissimilar to the price it was already paying for the slower DDR1 modules, with which it populated its 128MB variants of Radeon 9800 PRO.

So, we understand, having been burnt once, and with the imminent arrival of GDDR3 into the market, Samsung declined to produce any more GDDR2 modules for Nvidia, leaving the graphics firm with one of two options: either go with GDDR3 or drop back to the slower DDR1.

This worked out perfectly for ATI. Not only because of the aggressive price it paid for a significant amount of high-performance GDDR2, but because in adopting GDDR3 early, Nvidia was effectively dropping the cost of this new technology, and starting to create volume for Samsung.

This was and is an important factor for ATI as next week, like Nvidia's GeForce 5700 Ultra and GeForce 6800 Series, ATI will introduce its own GDDR3-equipped 3D accelerators, the Radeon X800 Series.

Nvidia being the first to adopt GDDR3, probably reduced 5700 Ultra margins in the already cut-throat mid-range consumer graphics sector.

But there's an ironic twist to this tale.

TSMC favoured Nvidia and thought ATI's proposals were 'like the tail trying to wag the dog', the thinking way back being that Nvidia was the top dog.

So Samsung was producing GDDR2 for Nvidia's GeForce FX 5800 Ultra (NV30), but then having got burned by NV30's lack of success, and after being quite negatively vocal about ATI for making GDDR3, it's very ironic that Samsung is now pushing GDDR3 so heavily.

I believe i am still goin for the 6850/6800u variant . . .

however, it's all good . . . i'd settle for a x800xt-PE, if the price and software bundle is better.
 

EmesisBucket

Banned
May 4, 2004
16
0
0
I'm going to be building a new comp soon to replace my aging P4 2.53 system. It looks as though I'll be buying the X800 Pro, unless nVidia pulls out some awesome drivers when the cards finally go live.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: shady06
Originally posted by: apoppin
Finally working ati promo link . . .
(i got it 2)

Yep, a HL -2 coupon.

:roll:

I'm waitin' for the xt-PE/6850u editions

*EDIT: $15 shipping - "ground" . . . here is where ati lacks 'class'.

:roll:

if they can get suckers to pay for it, why not? i dont see it as a lack of class at all, just good business
Those ati cheapskates are charging $450.00 for their no.2 or 3 card then have the audacity to charge $15 for GROUND (5-7 days) shipping. That's immoral . . . "class" would indicate either including the shipping charge (ala NewEgg) or at the LEAST offering OVERNIGHT . . .





AS WE HAVE PREDICTED, most lately yesterday, Nvidia's premiere competitor customer Gainward has indeed introduced a liquid-cooled 3D accelerator based upon the Nvidia GeForce 6800 Ultra XTreme.
Maintaining Gainward's tradition of product names to confuse a cunning linguist, this flagship product is identified as the CoolFX PowerPack! Ultra/2600 "Golden Sample".

With today's launch of ATi's X800 Series graphics cards, Nvidia's own attempt to dampen down the positive press ATi is currently basking in has been a mild 50MHz kneejerk'o'clock of the original 400MHz NV40 reference boards, but memory speeds remain the same at 550MHz.

Not content with this, Gainward has hand-picked a lot of the fastest 450MHz '6800 Ultra XTreme' GPUs, slipped on a trick Innovatek water block, and hopes to tweak the core of these up to 150MHz higher than other GeForce 6800 Ultras.

The Gainward CoolFX Ultra/2600 will feature 256MB of full-spec 600MHz GDDR3 memory which Gainward anticipates will clock beyond its rated 1200MHz DDR. All this willy-waving tweakery is expected to yield further performance increases upwards of 20%.

Based upon the numbers we've seen on even the vanilla Nvidia GeForce 6800 Ultra reference samples, we think the performance potential of the CoolFX Ultra/2600 "Golden Sample" is unlikely to disappoint. Though owning the fastest production 3D graphics accelerator bar none won't cost pennies; in fact a CoolFX will cost in the region of £599, so it's sure to be as exclusive as it is rapid.

That said, CoolFX Ultra/2600 does come complete with genuine Innovatek water cooling components including a high performance heat-exchanger, low noise 120mm fan and, say Gainward, "special plastic pipes and fittings to connect all components into a sealed water circulation system".

If you must have all this high performance in a wholly silent system, then CoolFX can be extended with the Gainward CoolPC upgrade kit, which includes AMD or Intel processor and core logic water blocks, plus all ancillaries, and coming in at around £100 inc VAT doesn't seem too bad.

As we also reported earlier, Gainward will also introduce a somewhat less expensive advanced air-cooled variant, the PowerPack! Ultra/2600 "Golden Sample". This is another 256MB GDDR3 board based upon the 16x pipeline GeForce 6800 Ultra, but is more conservatively clocked at 430MHz core and 1150MHz memory. Notwithstanding these lower base frequencies, the performance of PowerPack! Ultra/2600 is still likely to benefit from whatever Enhanced Settings Gainward's EXPERTool software tweaking utility will allow.

^ that's the ONE i want.




($599)



ouch!


:roll:
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Well, the X800XT can be water cooled too, can't it?

Being a lot cooler in the first place.......
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: LTC8K6
Well, the X800XT can be water cooled too, can't it?

Being a lot cooler in the first place.......
Sure, but it isn't necessary, nor does it give a much higher o/c than air-cooling - unlike the nVidia GPU.

it looks like the 400MHz core of the 6800u will EASILY take a 50MHz boost (~12% o/c) or 150MHz (~38%) with more aggressive cooling measures . . . a 38% o/c is substantial for a 200+ million transistor GPU.
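Those o/c figures are just the boost expressed as a ratio of the 400MHz reference core; a quick sketch of the arithmetic (the helper name is mine):

```python
def oc_percent(base_mhz: float, boost_mhz: float) -> float:
    """Overclock boost expressed as a percentage of the base clock."""
    return boost_mhz / base_mhz * 100.0

# 6800 Ultra reference core: 400 MHz
print(oc_percent(400, 50))   # 12.5 -> the air-cooled 50MHz bump
print(oc_percent(400, 150))  # 37.5 -> the water-cooled 150MHz headroom
```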
 

XBoxLPU

Diamond Member
Aug 21, 2001
4,249
1
0
Originally posted by: RussianSensation


Actually, crytek later released an explanation saying the comparison was between PS3.0/2.0 and PS1.1

The fact that Nvidia explained in many interviews that PS3.0 does NOT offer image quality enhancements over PS2.0 doesn't seem to sway your opinion. PS3.0 is supposed to make things faster, which in reality suggests that you COULD write longer instruction sets, with branching allowing for deeper shaders and thus better image quality. But if you wanted to, you could write the same code using PS2.0; it would just take longer. And if ATI cards already work faster in PS2.0 there is no need for them at this point to utilize PS3.0. Logically, if Nvidia works slower in PS2.0, it would only make sense for it to work SLOWER with the LONGER instruction sets of PS3.0. PS2.0 can do everything PS3.0 can, so image quality is IDENTICAL; just the way to get the same image is supposed to be optimized in 3.0, but that hasn't been proven yet with any game. Now nvidia is counting on running their games in fewer loops using PS3.0. I am not saying ATI is better and be done with it, but more games have to be compared that utilize PS3.0 to really decide if that feature is important or not.

As it stands, X800xt is the clear performance winner if you disregard price. Comparison of X800Pro to GT is another story which must be more carefully analyzed.

There is no point in arguing, as the people who made up their minds on buying ATI or Nvidia will never change their view, and then there remain the rest of us who want to wait for prices to settle and see what the best card to buy is.

PS2.0 can't do everything PS3.0 can

SM3.0 & Displacement Mapping

One of the more major upgrades in Shader Model 3.0 is the addition of Vertex Texture Lookups. What this allows is features like Displacement Mapping. If there is going to be any major difference in image quality comparing Shader Model 3.0 to 2.0 it is going to be with the use of Displacement Mapping. Bump mapping, which is currently used to give the appearance of height in textures, is just an illusion. There is no physical difference in the texture, meaning if you look at the texture from the side or dead on you will see that it is still flat; only from far away does bump mapping work. Even then it isn't the best option: since the texture is physically still flat, light and shadows do not reflect correctly. The answer is Displacement Mapping, which physically adds surface detail by manipulating the height of the texture. Displacement Mapping can even go as far as to create the model itself. Displacement Mapping may be a huge boon to adding realism in games. If developers pick up on this technology and we see it implemented in games, this right here could be the deciding feature that shows the most difference between a game rendered in Shader Model 3.0 and a game rendered in Shader Model 2.0.
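The distinction that article draws can be sketched in a few lines: bump mapping only fakes height by perturbing lighting normals, while displacement mapping actually moves each vertex along its normal by the value sampled from a height map, so the silhouette really changes. A toy CPU-side sketch (the function, the flat-quad data, and all names are mine for illustration, not from any real shader API):

```python
# Toy displacement mapping: offset each vertex along its normal by a
# per-vertex height sample, physically changing the geometry.
def displace(vertices, normals, heights, scale=1.0):
    """vertices/normals: lists of (x, y, z) tuples; heights: per-vertex samples."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), h in zip(vertices, normals, heights):
        out.append((vx + nx * h * scale,
                    vy + ny * h * scale,
                    vz + nz * h * scale))
    return out

# A flat quad facing +z: bump mapping would leave every z at 0 and only
# shade it differently; displacement mapping raises the surface itself.
verts   = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
norms   = [(0, 0, 1)] * 4
heights = [0.0, 0.5, 0.5, 1.0]
print(displace(verts, norms, heights))  # z values become 0.0, 0.5, 0.5, 1.0
```

Viewed edge-on, the displaced quad is no longer flat, which is exactly what bump mapping can never achieve.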

But I pretty much agree with what you said.. also to note from the HardOCP article


Shader Lengths & Passes

First let's look at the shader length in the Pixel and Vertex Shader. In Pixel Shader 2.0 the shader length can be up to 96 instructions long. With Pixel Shader 3.0 that instruction count has been increased to 65,535, and in the GeForce 6800 Ultra the instruction count is unlimited. With a longer instruction limit more effects can be applied per pixel in a game. However, writing a shader program with a very long instruction count is not the only way to achieve an effect that would require such a long shader. Long shaders can also be achieved by using multipassing. For example you could have several 96 instruction length shaders in Pixel Shader 2.0 that perform an effect by doing them in several passes, thus achieving the same effect that one very long shader program outputs. In fact the Radeon 9800 series has built within it an F-Buffer that is meant to aid shader multipassing.

One of the most anticipated games this year, Half Life 2, uses Shader Model 2.0 extensively but only has a shader length of 30-40 instructions, not even coming close to the 96 instruction limit in Pixel Shader 2.0. So concerning shader length, what we are left with is going to be the performance differences between running Shader Model 2.0 programs in many passes with multipassing versus running Shader Model 3.0 with very long shaders. As of right now we have no idea how well the GeForce 6 series video cards can run very long shaders, and we also have no idea how well the Radeon series can run Pixel Shader 2.0 instructions with many passes.
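The multipassing trade-off in that quote is easy to put in numbers: any shader longer than the PS2.0 limit has to be chopped into ceiling(length / 96) passes, and each extra pass costs bandwidth and fill rate. A quick sketch (the 96-instruction limit is from the article; the helper name is mine):

```python
import math

PS20_LIMIT = 96  # max instructions per Pixel Shader 2.0 program

def passes_needed(instructions: int, limit: int = PS20_LIMIT) -> int:
    """How many multipass renders a shader of this length needs under PS2.0."""
    return math.ceil(instructions / limit)

print(passes_needed(40))   # 1 -> a Half-Life 2 style 30-40 instruction shader
print(passes_needed(300))  # 4 -> a long SM3.0-class shader split under PS2.0
```

The same 300-instruction effect is a single pass under PS3.0's 65,535-instruction limit, which is the performance question the article says is still open.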
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Originally posted by: SmokeRngs
Originally posted by: NOX
Originally posted by: LTC8K6
Are those the shots that are actually 1.1 compared to 3.0?
These are: http://www.pcper.com/article.php?aid=36.

But the ones Shadowhawk linked to from HardOCP are 2.0 to whatever Nvidia is doing. Not to mention the HardOCP statue is not the same as the original screenshot. Not sure why, maybe they couldn't find the same one or they figure it was just for the sake of comparison.
Do not quote me on this and I don't have time to check it out right now, but I thought the nVidia screenshots were from a demo and not the actual game while the screenshots HardOCP did were from the actual game. If that is the case, it would explain the reason why the statues are not the same.
I believe you are correct. That would explain why they don't look the same.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
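Taking the list prices above at face value, the premium works out the same per core MHz for both upgrades over the GT (memory clock and pipeline differences ignored; the dict and helper are mine for illustration):

```python
# Price premium per extra core MHz over the 6800 GT, from the list above.
cards = {
    "6800 GT":            (350, 399),  # (core MHz, USD)
    "6800 Ultra":         (400, 499),
    "6800 Ultra Extreme": (450, 599),
}

def premium_per_mhz(name: str, baseline: str = "6800 GT") -> float:
    mhz, price = cards[name]
    base_mhz, base_price = cards[baseline]
    return (price - base_price) / (mhz - base_mhz)

for name in ("6800 Ultra", "6800 Ultra Extreme"):
    print(f"{name}: ${premium_per_mhz(name):.0f} per extra core MHz")  # $2 each
```

Which is why a GT that overclocks to Ultra Extreme speeds makes the $200 premium look so hard to justify.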
 

DerwenArtos12

Diamond Member
Apr 7, 2003
4,278
0
0
ok this is all well and wonderful but where and when can I actually get any one fo these new power houses?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
What about the Gainward CoolFX Ultra/2600?

Its 256MB of full-spec 600MHz GDDR3 memory will clock beyond 1200MHz DDR, and the core clocks to ~600MHz.


EDIT: WHEN? Ati's shipping the pro now and the xt b4 end of the month; 6800 end of month . . . more limited than i think they'd like.
(no discounts)
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Are those the proper prices?
It's criminal that nVidia are charging $100 more for 50MHz extra on just the core!

As I've said before, the GT is the interesting card of the nVidia line-up. At first they'll probably be using 12-pipe silicon that can only hit 350, but after a few months, when the process has been refined, I think we'll see GTs with cores that can OC A LOT more.
 

ed21x

Diamond Member
Oct 12, 2001
5,410
6
81
Originally posted by: BoomAM
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Are those the proper prices?
It's criminal that nVidia are charging $100 more for 50MHz extra on just the core!

As I've said before, the GT is the interesting card of the nVidia line-up. At first they'll probably be using 12-pipe silicon that can only hit 350, but after a few months, when the process has been refined, I think we'll see GTs with cores that can OC A LOT more.

I'm willing to bet the release of the Ultra Extreme was more or less a marketing ploy like the P4EE, just so the company can claim the top spot. I'm fairly certain they don't expect to sell that many of those cards, and will probably only make any money off of it through vendors like Alienware and VoodooPC.
 

Sid59

Lifer
Sep 2, 2002
11,879
3
81
Originally posted by: ed21x
Originally posted by: BoomAM
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
Are those the proper prices?
It's criminal that nVidia are charging $100 more for 50MHz extra on just the core!

As I've said before, the GT is the interesting card of the nVidia line-up. At first they'll probably be using 12-pipe silicon that can only hit 350, but after a few months, when the process has been refined, I think we'll see GTs with cores that can OC A LOT more.

I'm willing to bet the release of the Ultra Extreme was more or less a marketing ploy like the P4EE, just so the company can claim the top spot. I'm fairly certain they don't expect to sell that many of those cards, and will probably only make any money off of it through vendors like Alienware and VoodooPC.

then to the idiots who buy them in those packages.
 

Nebor

Lifer
Jun 24, 2003
29,582
12
76
Originally posted by: apoppin
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
What about the Gainward CoolFX Ultra/2600?

Its 256MB of full-spec 600MHz GDDR3 memory will clock beyond 1200MHz DDR, and the core clocks to ~600MHz.


EDIT: WHEN? Ati's shipping the pro now and the xt b4 end of the month; 6800 end of month . . . more limited than i think they'd like.
(no discounts)

I'll throw down $600 for the Gainward version.
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Originally posted by: Nebor
Originally posted by: apoppin
Originally posted by: RussianSensation
GeForce 6800 Ultra Extreme: 450 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$599

GeForce 6800 Ultra: 400 MHz core clock, 1100 MHz GDDR3, 16 pixel pipelines, US$499

GeForce 6800 GT: 350 MHz core clock, 1000 MHz GDDR3, 16 pixel pipelines, US$399

GeForce 6800: 350 MHz core clock, 700 MHz GDDR, 12 pixel pipelines, US$299


If 6800GT overclocks to 6800Ultra Extreme speeds, you gotta be insane to pay extra $200 for the UE card.
What about the Gainward CoolFX Ultra/2600?

Its 256MB of full-spec 600MHz GDDR3 memory will clock beyond 1200MHz DDR, and the core clocks to ~600MHz.


EDIT: WHEN? Ati's shipping the pro now and the xt b4 end of the month; 6800 end of month . . . more limited than i think they'd like.
(no discounts)

I'll throw down $600 for the Gainward version.
I'd throw the salesman out of the building for trying to sell me that!
The CoolFX was a good idea. The initial concept water blocks, designed by Bit-Tech for Gainward, were awesome (how video card blocks should be), but upon release Gainward made a mockery of the design and went with some other crappy design. The original Bit-Tech blocks were brilliant. I think that they should try and sell the design to someone like AseTek.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
As someone else has suggested:

I think NVidia gave reviewers cherry picked cores after they heard about R420.

Perhaps ATI will improve with new silicon?

I guess we won't know for sure until cards from both have been in the pipeline for a while, and John Q. Public has had a chance to try to blow them up.
 