Question DLSS 2.0


Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
There is no thread on this, and it's worth one because it seems to finally be realising its potential. There are many articles discussing it elsewhere - it's used by Wolfenstein: Youngblood, Control and a few other games and it works really well. Allegedly it's easy to add - no per-game training required. Gives a good performance increase and looks really sharp.

Nvidia article.
WCCF article.
Eurogamer article.

The above articles have some good comparison screenshots that really demonstrate what it can do.

Discuss...
 
Reactions: DXDiag

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Freesync exists because of gsync. Gsync came out and then AMD started to think about freesync, which took several years to get working properly - and basically started with monitor makers taking a gsync display and copying it to make a freesync one. Out of gsync came variable refresh rate displays for all. Hardly forgotten. I expect DLSS will be the same - Nvidia invents a new tech, they market it and make money from it, the rest of the market eventually catches up and it becomes ubiquitous.
 
Reactions: DXDiag

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
Freesync exists because of gsync.
Not true, VRR has been an implemented part of the VESA spec since 2009, far before gsync: https://vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Also gsync requires DP1.2 minimum, which again isn't a coincidence.

Gsync came out and then AMD started to think about freesync, which took several years to get working properly - and basically started with monitor makers taking a gsync display and copying it to make a freesync one. Out of gsync came variable refresh rate displays for all. Hardly forgotten.
How was gsync "copied"? The VRR spec pre-dates gsync by years and specifically avoids proprietary hardware or licensing fees. More like nVidia copied an open standard and slapped their tax on it.

After freesync was implemented more and more people realized the utter lunacy of locking a monitor purchase to a graphics card and also paying a monitor tax, which forced nVidia's hand.

Exactly the same thing happened with SLI. Initially it was locked to nForce chipsets until Intel forced nVidia's hand.

I expect DLSS will be the same - Nvidia invents a new tech, they market it and make money from it, the rest of the market eventually catches up and it becomes ubiquitous.
Historically a feature locked to a single vendor almost never becomes ubiquitous. I mean who here claims hardware PhysX, TXAA or 3DVision are "ubiquitous"? These technologies are dead as a dodo despite being repeatedly hyped as the second coming on this very forum.

As for DLSS 2.0, it's certainly better than the fraudulent monstrosity that is 1.0, but it still needs per-game-support and is locked to a single vendor. Unlike upscaling + sharpening which works virtually everywhere.
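
For context, the generic "upscaling + sharpening" path is trivial to reproduce. A minimal sketch using Pillow - a Lanczos resize followed by an unsharp mask; the filenames, scale factor and filter parameters are placeholder values, nothing vendor-specific:

Code:
from PIL import Image, ImageFilter

def upscale_and_sharpen(path_in, path_out, scale=2.0):
    img = Image.open(path_in)
    # Plain spatial upscale: single frame, no motion vectors, runs on anything.
    up = img.resize((int(img.width * scale), int(img.height * scale)),
                    resample=Image.LANCZOS)
    # Unsharp mask to recover some of the edge contrast lost in the resize.
    up.filter(ImageFilter.UnsharpMask(radius=2, percent=130, threshold=3)).save(path_out)

upscale_and_sharpen("frame_1080p.png", "frame_4k.png", scale=2.0)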
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Not true, VRR has been an implemented part of the VESA spec since 2009, far before gsync: https://vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Also gsync requires DP1.2 minimum, which again isn't a coincidence.

How was gsync "copied"? The VRR spec pre-dates gsync by years and specifically avoids proprietary hardware or licensing fees. More like nVidia copied an open standard and slapped their tax on it.
VRR in the VESA spec is not freesync. Like you said, it existed in 2009, but it was many years after the first gsync display that a properly working freesync one arrived. Would it have happened without gsync? Perhaps eventually, but the monitor makers are lazy. They wouldn't have put much effort into developing it without gsync. What gsync did was make them build compatible displays which could handle the VRR, including variable overdrive to the very strict gsync spec, and showed them it working. That was key to developing displays that could do freesync well.
Historically a feature locked to a single vendor almost never becomes ubiquitous. I mean who here claims hardware PhysX, TXAA or 3DVision are "ubiquitous"? These technologies are dead as a dodo despite being repeatedly hyped as the second coming on this very forum.
PhysX is the most common physics engine in games today I think, the hardware bit not so much as CPUs have improved, but that tech is still in use all over the place. 3D Vision has now died as VR became a thing, but it is the thing that pushed super low-blur, high refresh rate monitors. Before that there were none, and for years the only ones that existed were for 3D Vision. Now it's standard; even if you never wanted 3D, you can thank 3D Vision for your high refresh rate, low-blur monitor.
As for DLSS 2.0, it's certainly better than the fraudulent monstrosity that is 1.0, but it still needs per-game-support and is locked to a single vendor. Unlike upscaling + sharpening which works virtually everywhere.
Your hatred of DLSS around here still burns bright, I see. DLSS 2 is just better upscaling + sharpening because it uses AI hardware to enable vastly more complex algorithms. That's the same hardware as DLSS 1 used, but they've improved the software, and I'm sure they will continue to improve it. What's key is the hardware; with only shaders you are more limited, as you don't have the same image processing compute power.
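
To make the distinction concrete: DLSS 2 belongs to the family of temporal upscalers, which reproject the previous output using engine-supplied motion vectors and blend it with the new low-resolution frame, rather than working from a single frame. A rough NumPy sketch of that accumulation idea - a deliberate simplification with made-up array shapes and a fixed blend factor, not NVIDIA's actual network, which replaces the naive blend with a neural net running on the tensor cores:

Code:
import numpy as np

def temporal_upscale_step(low_res, prev_output, motion_vectors, blend=0.1):
    """One conceptual step of a temporal upscaler (illustration only, not DLSS itself).

    low_res:        (h, w, 3)  current frame rendered below output resolution
    prev_output:    (H, W, 3)  previous full-resolution output (the history)
    motion_vectors: (H, W, 2)  per-pixel motion in output-resolution pixels
    """
    H, W, _ = prev_output.shape
    h, w, _ = low_res.shape

    # Naive nearest-neighbour upsample of the current frame to output resolution.
    ys = np.arange(H) * h // H
    xs = np.arange(W) * w // W
    upsampled = low_res[ys[:, None], xs[None, :]]

    # Reproject history: look up where each output pixel was last frame.
    yy, xx = np.mgrid[0:H, 0:W]
    py = np.clip((yy - motion_vectors[..., 1]).round().astype(int), 0, H - 1)
    px = np.clip((xx - motion_vectors[..., 0]).round().astype(int), 0, W - 1)
    history = prev_output[py, px]

    # Exponential accumulation: mostly history, refreshed by the new samples.
    return blend * upsampled + (1.0 - blend) * history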
 
Reactions: sxr7171 and DXDiag

DXDiag

Member
Nov 12, 2017
165
121
116
PhysX is the most commonly used physics API out there. It pioneered GPU-accelerated particles, which are used everywhere in current games. It's also still used in some games today, among them Metro Exodus.

TXAA pioneered temporal AA at a time when it was scarce in games; over time, TXAA turned into TAA.

3D Vision was the only game in town when it came to 3D gaming (before the resurgence of VR); AMD's alternative was practically non-existent and went defunct a few months after its arrival.

So really, none of those things are truly dead at all. G-Sync Ultimate is the only standard right now that provides Local Dimming + HDR 1000/HDR 1400, a complete VRR experience (from 1Hz to 144Hz), and variable overdrive. FreeSync 2 can't provide this level of quality just yet.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,039
7,461
136
DX and Dribble, you make good points, but ultimately your points support the underlying argument "DLSS 2.0 is good stuff, but if it's proprietary it's not likely to stick around".

I don't think there is much of an argument (at least not one I'm making) against "proprietary technology can lead to industry-wide support of an open standard", but there is an argument against proprietary tech introduced by either vendor hanging around for more than a couple generations of cards.

PhysX - GPGPU PhysX is effectively dead at this point. PhysX as a CPU-driven competitor to Havok is nice, but not really unique or relevant to the conversation at large about proprietary tech, since it went from "Only NV GPUs" to "Runs on all CPUs".

Gsync - NV's proprietary implementation was definitely nice and adhered to much stricter tolerances than the open VRR standard or Freesync that followed, but with NV essentially conceding the race by opening their last two gens of cards to "Gsync Compatible" aka Freesync monitors, you're going to see NV only monitors dwindle and finally disappear (not that there were many to begin with).

I just used those examples, but you can also look at proprietary renderers like Glide or, more recently, Mantle, or ATI TruForm tessellation. Those techs were eventually scrapped and folded into "open" (not that DX is open) standards that everyone can use.
 

Hitman928

Diamond Member
Apr 15, 2012
5,567
8,717
136
VRR in the VESA spec is not freesync. Like you said, it existed in 2009, but it was many years after the first gsync display that a properly working freesync one arrived. Would it have happened without gsync? Perhaps eventually, but the monitor makers are lazy. They wouldn't have put much effort into developing it without gsync. What gsync did was make them build compatible displays which could handle the VRR, including variable overdrive to the very strict gsync spec, and showed them it working. That was key to developing displays that could do freesync well.

PhysX is the most common physics engine in games today I think, the hardware bit not so much as CPUs have improved, but that tech is still in use all over the place. 3D Vision has now died as VR became a thing, but it is the thing that pushed super low-blur, high refresh rate monitors. Before that there were none, and for years the only ones that existed were for 3D Vision. Now it's standard; even if you never wanted 3D, you can thank 3D Vision for your high refresh rate, low-blur monitor.

Your hatred of DLSS around here still burns bright, I see. DLSS 2 is just better upscaling + sharpening because it uses AI hardware to enable vastly more complex algorithms. That's the same hardware as DLSS 1 used, but they've improved the software, and I'm sure they will continue to improve it. What's key is the hardware; with only shaders you are more limited, as you don't have the same image processing compute power.

The first Gsync monitors came out early 2014. The first Freesync monitors came out early 2015. How is that many years?
 
Reactions: Tlh97

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Gsync - NV's proprietary implementation was definitely nice and adhered to much stricter tolerances than the open VRR standard or Freesync that followed, but with NV essentially conceding the race by opening their last two gens of cards to "Gsync Compatible" aka Freesync monitors, you're going to see NV only monitors dwindle and finally disappear (not that there were many to begin with).
G-Sync monitors aren't going anywhere. The v2 G-Sync module is still a requirement for HDR in the G-Sync Ultimate standard and NVIDIA will open up Adaptive Sync over HDMI/DP and HDMI-VRR in the future in G-Sync module screens with new firmware. With these developments G-Sync will stay around for a while.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,039
7,461
136
G-Sync monitors aren't going anywhere. The v2 G-Sync module is still a requirement for HDR in the G-Sync Ultimate standard and NVIDIA will open up Adaptive Sync over HDMI/DP and HDMI-VRR in the future in G-Sync module screens with new firmware. With these developments G-Sync will stay around for a while.

- That's good information, but from the link provided (bolding not done by me):

It could also work with any graphics card based on the adaptive-sync standard over HDMI and DisplayPort. This means that you would be able to use a Native G-sync screen (with module) from an AMD graphics card for VRR! So if you have an AMD graphics card, you could still enjoy the VRR experience and other additional benefits that the G-sync module brings even from a Native G-sync screen, which was previously out of reach to those users.

It looks like Gsync Ultimate is going to live on as an "open" (like DX is "open", usable by anyone) standard, not something vendor locked like the original Gsync was.
 
Reactions: Tlh97

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
G-Sync monitors aren't going anywhere. The v2 G-Sync module is still a requirement for HDR in the G-Sync Ultimate standard and NVIDIA will open up Adaptive Sync over HDMI/DP and HDMI-VRR in the future in G-Sync module screens with new firmware. With these developments G-Sync will stay around for a while.

Just to be clear, GSync Ultimate is not a standard. It's more of a list of requirements a device must meet to receive nVidia's approval.

Adaptive-Sync is a VESA approved standard.
 
Last edited:

DeathReborn

Platinum Member
Oct 11, 2005
2,755
751
136
Just to be clear, GSync Ultimate is not a standard. It's more of a list of requirements a device must meet to receive nVidia's approval.

FreeSync is a VESA approved standard.

Adaptive-Sync wants its name recognised - that is the standard; FreeSync is AMD's implementation of that standard.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
DX and Dribble, you make good points, but ultimately your points support the underlying argument "DLSS 2.0 is good stuff, but if it's proprietary it's not likely to stick around".
I don't disagree it's proprietary and will eventually be replaced by a more open standard. That's the same with most technology - someone invents it, they get the early benefits, then it gets copied by everyone else and eventually standardised.

The first Gsync monitors came out early 2014. The first Freesync monitors came out early 2015. How is that many years?
Hence me adding "properly working" - having something that did some tiny range with no working variable overdrive or low frame rate compensation doesn't count. Remember, every single Gsync display right back to the very first one did all this. That's both why Nvidia were able to avoid supporting freesync for so long, and why it eventually started working as well as it did. Even now the best branding to look for on a freesync display is "gsync compatible", because it shows it's been QA'd properly - which is in turn forcing up the quality of freesync monitors, as all makers want that stamp to sell more.
 

Hitman928

Diamond Member
Apr 15, 2012
5,567
8,717
136
I don't disagree it's proprietary and will eventually be replaced by a more open standard. That's the same with most technology - someone invents it, they get the early benefits, then it gets copied by everyone else and eventually standardised.


Hence me adding "properly working" - having something that did some tiny range with no working variable overdrive or low frame rate compensation doesn't count. Remember, every single Gsync display right back to the very first one did all this. That's both why Nvidia were able to avoid supporting freesync for so long, and why it eventually started working as well as it did. Even now the best branding to look for on a freesync display is "gsync compatible", because it shows it's been QA'd properly - which is in turn forcing up the quality of freesync monitors, as all makers want that stamp to sell more.

"Properly working" is a loaded term. Freesync was working from the beginning as intended. Yes it obviously came out after Gsync and didn't include everything Gsync did from the start. However, saying it took "many years" is a gross exageration. AOC g2460pf came out mid 2015 and had a 35 Hz - 144 Hz VRR range. That was just 1 model. By November of 2015, AMD had low frame rate compensation in their drivers for freesync, but honestly it wasn't that big a deal considering how low most monitors could go in VRR range to begin with.

Yes, you got a lot of crappy implementations with Freesync monitors; that's the price you pay with open standards. That doesn't mean there weren't quality models available. I think many people also like having the option to go with a budget model, knowing it doesn't have all the bells and whistles. What good does a Gsync certified monitor do for someone who won't/can't spend more than $100 - $200 on a monitor? Freesync gave those people options.

Lastly, Gsync compatible came around because Nvidia had to start supporting VRR through the VESA standard and those monitors do not go through the same certification that Gsync monitors go through. Nvidia basically makes sure that the freesync monitor works with Nvidia cards and that it has a 2.4:1 VRR range, then they slap the Gsync compatible label on it. That's it.

In addition, not all monitors go through a formal certification process, display panel quality varies, and there may be other issues that prevent gamers from receiving a noticeably-improved experience. . . We will test monitors that deliver a baseline VRR experience. . . G-SYNC Compatible testing validates that the monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming. They also validate that the monitor can operate in VRR at any game frame rate by supporting a VRR range of at least 2.4:1
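
For anyone wondering why that 2.4:1 figure matters: LFC (on either vendor) essentially repeats frames when the game drops below the panel's minimum VRR rate, so the effective refresh lands back inside the supported window, and a wide enough max:min ratio is what guarantees an integer multiple always fits. A rough sketch of the idea - the 48-144 Hz window is just an example monitor, and real drivers use more elaborate frame-pacing heuristics:

Code:
def lfc_refresh(frame_rate, vrr_min=48, vrr_max=144):
    """Refresh rate the driver would drive for a given game frame rate (simplified)."""
    if frame_rate >= vrr_min:
        return frame_rate                     # already inside the VRR window
    multiplier = -(-vrr_min // frame_rate)    # ceiling division
    return min(frame_rate * multiplier, vrr_max)

for fps in (144, 90, 40, 24):
    print(fps, "fps ->", lfc_refresh(fps), "Hz")   # 144, 90, 80, 48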

 
Reactions: Tlh97 and Elfear
Mar 11, 2004
23,155
5,623
146
Regardless, it doesn't matter.

DLSS 2.0 with 1080p upscaled to 4k + ultra settings + maxed ray tracing on is playable on an RTX 2070.

That's incredible.

And it'll look way better than native 4k with settings turned down and no maxed ray tracing.

Nvidia's 7nm GPUs combined with DLSS 2.0 will allow mid-range cards to play in "4k" resolution with full ray tracing.

Nvidia has done an incredible job with the RTX generation. A lot of people hated on the RTX 2000 series but I think it will be remembered as one of the best ever.

Actually it does matter, as it means that ray-tracing is less viable as a whole, which will limit how quickly it advances. Full ray-tracing is going to be a HUGE performance hit over this lower-res cheat method, which means it's going to take forever, or something much more than RTX/DXR/etc, to push ray-tracing adoption, because of how much of a performance hit it will take when the old route can already get you mostly there. We're moving further in the direction of trying to fit a bunch of different stuff together to try and make several cheats viable. What's next, Nvidia sells us on pre-rendered and we get 1TB game sizes where it streams in most assets? We go to using one eye so we can cut resolution and limit the need for image depth?

Yes that's hyperbolic, but this isn't a good sign for ray-tracing, and we're going to end up right back where we were at the start of the last decade, searching for something to enable better utilization of the hardware after all the tricks and other things they'd resorted to just made things stupidly complex and difficult to make work well while also constantly having performance issues.

Heck, the reason why DXR supposedly is going to be good is that it'll let them bypass a whole bunch of the lighting stuff they'd done. Why do they need DLSS to bring the clarity that ray-tracing was supposed to? Now they're seriously trying to sell us that ray-tracing offers a performance boost. What happened to that?

I'm really wondering if it wouldn't be better to just go back full force on straightforward rendering, reworked from the ground up with modern knowledge in mind. That's kinda what we were sold with regards to ray-tracing, but it sure looks like we're hunting for every type of cheat we can find to force it in while making things look as good as they would have if we hadn't bothered with it at all.
 
Reactions: Tlh97 and Elfear

AtenRa

Lifer
Feb 2, 2009
14,003
3,360
136
I expect DLSS will be the same - Nvidia invents a new tech, they market it and make money from it, the rest of the market eventually catches up and it becomes ubiquitous.

DLSS 2.0 was created using the tech Microsoft/AMD/NVIDIA/Intel created for DirectML in DX-12. Same for RTX: it is the same tech MS/AMD/NVIDIA and Intel, among others, created for DXR in DX-12.
NVIDIA didn't invent the technology of Deep Learning Super Sampling and Super Anti-Aliasing that DLSS 2.0 is using.

DirectML Super Resolution Neural Network in DX-12 Siggraph 2018

Granted, NVIDIA was the first to implement those technologies into working products in the consumer market, but that doesn't make NVIDIA the ones who invented those technologies.
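
As a point of reference for what vendor-neutral "DirectML super resolution" looks like from the application side: a trained super-resolution network can run on any DX12-class GPU through a DirectML-backed runtime. A minimal sketch using onnxruntime's DirectML execution provider - the model filename is a placeholder and the pre/post-processing is model-specific:

Code:
import numpy as np
import onnxruntime as ort

# Hypothetical ESRGAN-style super-resolution model exported to ONNX.
session = ort.InferenceSession("super_resolution.onnx",
                               providers=["DmlExecutionProvider"])  # DirectML backend

def upscale(frame_rgb):                        # frame_rgb: (h, w, 3) uint8
    x = frame_rgb.astype(np.float32) / 255.0
    x = np.transpose(x, (2, 0, 1))[None]       # NCHW, batch of 1
    y = session.run(None, {session.get_inputs()[0].name: x})[0]
    y = np.clip(np.transpose(y[0], (1, 2, 0)), 0.0, 1.0)
    return (y * 255).astype(np.uint8)          # upscaled (H, W, 3) frame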

Same goes for the Tensor Cores: those were invented/created by Google for their TensorFlow inference tech. NVIDIA used Tensor Cores in order to sell GPUs for TensorFlow and DL. They were the first to implement Tensor Cores in the consumer market, but they didn't invent them.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,360
136
PhysX is the most common physics engine in games today I think, the hardware bit not so much as cpu's have improved but that tech is still in use all over the place.

NVIDIA made PhysX open source in 2005; it's not proprietary tech anymore.
I don't know about today, but before PhysX was made open source, it was Bullet (an open source physics engine) that was used in the majority of games.
 

DXDiag

Member
Nov 12, 2017
165
121
116
They were the first to implement
The first to implement is often the co-creator or the co-inventor. Denying the efforts NVIDIA put into a lot of the standards you are using today seems pretty immature to me. You start with hardware T&L (the GPU as we know it) and follow the line all the way to DXR/AI Super Sampling.
Freesync was working from the beginning as intended.
No, Adaptive VRR was a half-assed, half-baked standard; FreeSync just copied it. G-Sync was a fully developed standard with harsh quality requirements: complete refresh range (1Hz to 120Hz), variable overdrive, HDR with local dimming, etc. FreeSync lacked all of that; AMD patched it later with FreeSync 2, but still not to the same extent as NVIDIA. G-Sync Ultimate remains the leading VRR with HDR standard to this day.
The AOC G2460PF came out in mid-2015 and had a 35 Hz - 144 Hz VRR range
One model with lackluster refresh range (that doesn't include 1Hz) is barely any good.

AMD had low frame rate compensation in their drivers for freesync, but honestly it wasn't that big a deal considering how low most monitors could go in VRR range to begin with.
Driver level is not the same as the monitor supporting it. Driver-level support was very limited by the monitor's capabilities.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,360
136
The first to implement is often the co-creator or the co-inventor.

Not this time.

Denying the efforts NVIDIA put into a lot of the standards you are using today seems pretty immature to me. You start with hardware T&L (the GPU as we know it) and follow the line all the way to DXR/AI Super Sampling.

Nobody here denies the efforts NVIDIA has made to the hardware and gaming industry.
 
Reactions: Tlh97

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The first to implement is often the co-creator or the co-inventor. Denying the efforts NVIDIA put into a lot of the standards you are using today seems pretty immature to me. You start with hardware T&L (the GPU as we know it) and follow the line all the way to DXR/AI Super Sampling.

No, Adaptive VRR was a half-assed, half-baked standard; FreeSync just copied it. G-Sync was a fully developed standard with harsh quality requirements: complete refresh range (1Hz to 120Hz), variable overdrive, HDR with local dimming, etc. FreeSync lacked all of that; AMD patched it later with FreeSync 2, but still not to the same extent as NVIDIA. G-Sync Ultimate remains the leading VRR with HDR standard to this day.

One model with lackluster refresh range (that doesn't include 1Hz) is barely any good.


Driver level is not the same as the monitor supporting it. Driver-level support was very limited by the monitor's capabilities.

Nobody denies the work nVidia has put into the gaming industry. But having such massive tunnel vision that they did it alone is extremely close-minded. DirectX is developed by many companies in conjunction. Nothing that uses DirectX gets done without the others involved knowing.

Second, and again, G-Sync is *NOT* a standard! And there was nothing "half-baked" about Adaptive Sync. Third, the original implementation of G-Sync did NOT have HDR support! HDR monitors didn't exist until years after G-Sync first came out.

And you keep bringing up this ability to go down to 1Hz, which literally impacts nobody in the desktop world. There really is no reason to need anything below 30, because if your frame rates are that low, you need to turn some settings down - the game is already unplayable. The whole 1Hz thing was intended for LAPTOPS only, to save energy. But very few actually used it.

There were plenty of side-by-side tests done between nothing, G-Sync, and Adaptive-Sync. And the vast majority said G-Sync and Adaptive-Sync looked the same. The others chose one or the other, but didn't declare a clear winner.
 
Reactions: Tlh97 and Elfear

Hitman928

Diamond Member
Apr 15, 2012
5,567
8,717
136
No, Adaptive VRR was a half-assed, half-baked standard; FreeSync just copied it. G-Sync was a fully developed standard with harsh quality requirements: complete refresh range (1Hz to 120Hz), variable overdrive, HDR with local dimming, etc. FreeSync lacked all of that.

Good to know there are people gaming out there at 1 fps...

Seriously though, almost everything you're mentioning has nothing to do with VRR. Local dimming - what does that have to do with VRR? Nvidia had an expensive solution for VRR, which meant that the monitors that came with it were going to be expensive. Instead of making it work through the established standard and making it cheap, they kept it as a premium solution, and to support a premium brand they had to make sure that the monitors that had Gsync were premium monitors to justify the price. That's it. It was a marketing play, not some great technical achievement that all Gsync monitors had so many features. Freesync changed all that and gave VRR to pretty much any monitor that wanted to implement the standard, which was really cheap to do. This forced Nvidia to eventually support Freesync monitors with their Gsync compatible marketing.

AMD patched it later with FreeSync 2, but still not to the same extent as NVIDIA.

How do you patch marketing? That makes no sense. BTW, AMD has moved past Freesync 2 and now has a couple of tiers I believe (Premium and Pro or something like that). Again, this is just marketing that says these monitors with these stickers have a minimum set of features; there's nothing technical about it.

G-Sync Ultimate remains the leading VRR with HDR standard to this day.

As was already pointed out, Gsync didn't have HDR from the beginning because HDR didn't exist when Gsync launched. "Freesync 2" with HDR support was actually announced at the same time Nvidia announced HDR support for Gsync although (if memory serves me correctly) it took a while for monitors supporting either to actually come out.

One model with lackluster refresh range (that doesn't include 1Hz) is barely any good.

35 - 144 Hz is a lackluster range? What? How many Gsync monitors actually go down to 1 Hz? How many go below 30 Hz? I'll give you a hint: it's zero. That was the whole purpose of LFC, which AMD enabled through its drivers (more on that next).


Driver level is not the same as the monitor supporting it. Driver-level support was very limited by the monitor's capabilities.

AMD's solution happens at the driver level so it's not about the monitor supporting it. The only caveat is that there needs to be a wide enough range in VRR for it to be activated. Same as Gsync.

 

FiendishMind

Member
Aug 9, 2013
60
14
81
DLSS 2.0 was created using the tech Microsoft/AMD/NVIDIA/Intel created for DirectML in DX-12. Same for RTX: it is the same tech MS/AMD/NVIDIA and Intel, among others, created for DXR in DX-12.
NVIDIA didn't invent the technology of Deep Learning Super Sampling and Super Anti-Aliasing that DLSS 2.0 is using.

DirectML Super Resolution Neural Network in DX-12 Siggraph 2018
Am I missing something? In that presentation, the MLSS/DLSS model is specifically credited to Nvidia. It seems very much like Nvidia's baby to me.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The fact the developer has to enable it, even if it's a really small effort, is still the issue. It's a catch-22. The kinds of games where I could really use the DLSS speed-up are the poorly coded, unoptimized games that don't run well despite me having a very fast rig - like Escape from Tarkov, for example. The games that have DLSS enabled are already big-name titles with significant optimization put into them, meaning I don't even need the feature. Once I can do high-quality upscaling from a lower resolution in badly optimized games like Ark, EFT, etc., it will be an absolute killer feature.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The reason a lot of badly optimised games are like that is that they've decided to write the graphics engine themselves, or have tweaked the life out of some ancient one they've been using for the past 10 years. I tend to agree - most won't get DLSS. Most new games, however, will be using Epic/Unity/etc., and those engines will have built-in support, so it'll be very easy to add, I expect.

As for where it's key - well, ray tracing: it makes ray tracing something you can run today on all the current RTX cards without massive sacrifices elsewhere. When ray tracing first came out, most people agreed it would be useless on this gen of cards - in fact, that it would take 3-5 years before we could use it. With DLSS 2 that's just not true; even the lowest-end 2060 runs ray-traced games pretty well. I expect every ray-traced game will support it, so it's going to be in the games that most need it.
 
Reactions: DXDiag