Will Nvidia Project Denver Succeed?

cbn

Lifer
Mar 27, 2009
12,968
221
106
Depends on your definition of success...

I'm not sure Nvidia needs to succeed in servers in order to make Project Denver worthwhile.

Here are some thoughts I posted in my AMD/ARM thread regarding some possible non-HPC motives behind Project Denver.

1. Nvidia has claimed that Google is the company's future.

2. Nvidia AFAIK doesn't have any presence in future gaming consoles (rumors from Feb 2009 claimed Intel had won the graphics contract for the Sony PS4).

3. Nvidia could kill two birds with one stone (Intel/AMD) by positioning itself and Google as a more efficient alternative to playing PowerPC console ports on Windows: more performance from less hardware, because the software is programmed natively to the platform and to a high-power Cortex-A15. If Nvidia is already sponsoring millions of dollars for the PC TWIMTBP program, why wouldn't it do the same for Google?

4. Having a dominant share on a desktop platform might help them expand CUDA/PhysX.
 

Martimus

Diamond Member
Apr 24, 2007
4,488
153
106
Succeed at what?

To overtake Intel/AMD as the processor of choice within desktops? No, I sincerely doubt that will happen.

To grow into their own niche market? I would assume yes.

To be profitable to continue to advance and produce? This one is a big question mark. I am leaning toward no, but it would not surprise me if it became profitable at some point.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
To be profitable to continue to advance and produce? This one is a big question mark. I am leaning toward no, but it would not surprise me if it became profitable at some point.

I like this criterion.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
http://arstechnica.com/gadgets/news...enver-cpu-puts-the-nail-in-wintels-coffin.ars

NVIDIA's Project Denver CPU puts the nail in Wintel's coffin
By Jon Stokes | Last updated 8 days ago

For years there have been rumors that NVIDIA has a top-secret x86 processor project, and last November an NVIDIA exec all but confirmed that the company is looking at making an x86 chip at some point. That's why today's processor announcement from NVIDIA was both surprising and unsurprising.

No, NVIDIA didn't finally take the wraps off its x86 project—assuming that it hasn't been cancelled, that's still a secret. But the chipmaker did unveil Project Denver, a desktop-caliber ARM processor core that's aimed squarely at servers and workstations, and will run the ARM port of Windows 8. This is NVIDIA's first attempt at a real general-purpose microprocessor design that will compete directly with Intel's desktop and server parts.

The company has offered nothing in the way of architectural details, saying only that the project exists and that the company has had a team of crack CPU architects working secretly on it for some time. Indeed, NVIDIA CEO Jen-Hsun Huang's very brief but dramatic announcement of Denver raised more questions than it answered. However, I think I have a good idea of exactly what the first Denver-based chips will look like.

But before I try to put the pieces together, let me lay them all out on the table by walking back through the relevant section of the keynote.
Supercomputers, ARM, Windows 8

Jen-Hsun set up the Denver announcement in a very peculiar way. After spending most of the event talking about mobiles, he suddenly put up a slide about supercomputers. Immediately, I flagged this as a change of topic to Tesla, but alarm bells started going off. Tesla is not a topic for CES—the Consumer Electronics Show. In fact, supercomputing, aka high-performance computing (HPC), is not a topic for a CES presentation. This was truly the moment where I heard the record scratch and thought "what's happening here?"

Then Jen-Hsun started talking about ARM and the power of the ARM ecosystem, with that supercomputing slide up the whole time. The press conference had now gone from curious to downright bizarre. And, just when it couldn't get any weirder, he put a Bloomberg quote about an ARM port of Windows up on the screen and basically confirmed the rumor by saying he was headed over to the Microsoft announcement shortly.

So he started out talking about supercomputers and servers, then he jumped to ARM, and then to Windows 8. I already had whiplash when he dropped the Project Denver bombshell.

After it sunk in that NVIDIA will produce a high-performance, desktop- and server-caliber, general-purpose microprocessor core, and that this processor core will power PCs running Windows, most of the picture had clicked into place. As of today, Wintel is officially dead as a relevant idea and a tech buzzword with anything more than historical significance. Sure, not much will change in the x86-based Windows PC market this year, but "Wintel" is really and finally dead as a term worth using and thinking with.

As I said, most of the picture is now complete, but there are still some pieces of this puzzle left on the table.
Missing pieces and TV

What still nags me about Denver is that ISSCC, not CES, is the place where new high-performance processor architectures are announced. This is especially true when those processors are aimed at servers and supercomputers. Announcing such a beast at CES is just strange.

The primary way that this timing makes any sense is that NVIDIA wanted to tie the unveiling to the Windows/ARM announcement, so they couldn't wait for ISSCC in February. They had to announce before Microsoft, so they did it at CES with only a few minutes to spare. I'm mostly happy with this explanation, but only mostly.

Maybe it's just the CE-heavy show environment, but I'm strongly inclined to believe that the Denver CPU is going to make its way into televisions, as well. HTML5 and Flash on an HDTV take real horsepower, both CPU and GPU—this is a job for a multicore, out-of-order processor. That's why Intel is going to be putting real x86 silicon in TVs, and the TV is going to have enough of an appetite for those clock cycles that this move will make sense.

Tegra 2 will be fantastic for phones and tablets, especially if you're looking for a phone that can double as a portable game console. But an Internet-connected TV can use even more horsepower than Tegra 2 can provide. That's where Denver could be NVIDIA's answer to Intel's CE-oriented SoC line.

Then there's Microsoft's Windows/ARM port. Microsoft clearly wants a piece of the Internet TV action—capturing this convergence moment was the whole point of the original Xbox effort within the company. But as much as the Xbox 360 does, it's still a game console, and Kinect takes it further in this direction. One of the upcoming Windows/ARM flavors could be aimed at the TV, and it could very well run on Denver. Such a combination would take on Intel's Smart TV effort directly.
Supercomputers, desktop gaming, and what Denver will look like

Ironically, despite the project's debut at CES, the consumer electronic piece of the Denver picture is the murkiest. When it comes to HPC and desktop gaming, things are a lot clearer, down to what the first Denver-based chips will look like when they launch.

A few months back, I wrote the following about AMD's Fusion project. Read the paragraphs as written, but substitute "ARM" for "x86":

It may turn out to be the case that few workloads really benefit from more than four cores, and most of those that do will run better on GPU hardware.

If this happens, then why not put those four CPU cores on a high-end GPU? In other words, in a world where Moore's Law continues to drive transistor counts up but where exceeding four CPU cores offers rapidly diminishing returns vs. a four-core + GPU combination, the best arrangement would seem to be one that looks essentially like a large GPU with four CPU cores attached to it.

Thinking about the ultimate x86 gaming system of 2015, a processor that combines four general-purpose CPU cores with a massive amount of GPU vector hardware and cache sounds ideal. With this arrangement, the relative amount of die area that goes to those four CPU cores can shrink as the (infinitely scalable) cache and vector hardware grow with transistor counts, to the point where you ultimately end up with a "GPU" that has four little CPU cores embedded in it.

Of course, you wouldn't be able to physically turn on all that hardware at once, so dynamic power optimization would be key to making such a part work. But in terms of cost, efficiency, and raw performance, it would probably beat the pants off of a 12-core x86 chip + discrete GPU combination for games and most of the other tasks people care about.
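
To make that division of labor concrete, here's a minimal CUDA-style sketch (my own illustration, with made-up sizes and names, not anything NVIDIA has described): a few latency-oriented CPU cores handle the serial setup and control flow, while the wide GPU vector hardware chews through the data-parallel bulk.

```cuda
// Illustrative only: the "few CPU cores + big GPU" split in miniature.
#include <cstdio>
#include <cuda_runtime.h>

// Data-parallel bulk of the work: one GPU thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                  // hypothetical problem size
    const size_t bytes = n * sizeof(float);

    // Serial setup runs on the general-purpose CPU cores.
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc((void**)&dx, bytes);
    cudaMalloc((void**)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // The parallel bulk runs on the GPU's vector hardware.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);           // 2*1 + 2 = 4

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

Notice which half of this program scales with transistor count: the kernel side can soak up arbitrarily wide vector hardware, while the host side stays a handful of serial cores. That asymmetry is the whole argument for the "big GPU with four CPU cores attached" layout.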

Given that the Denver core is designed to be integrated onto the same die as a GPU, and that NVIDIA is pitching it as a server and supercomputer part, it seems likely that the above describes the route that they're taking with it.

The first Denver-based products will probably be two or four high-performance ARM cores, embedded in a much larger pool of GPU vector hardware. In subsequent product generations, the core count might stay at four (or even go up to six), while NVIDIA scales the vector and cache hardware out to the horizon.

To make such a chip live up to its full potential, NVIDIA will have to do a lot more than just design a top-notch ARM core and a top-notch GPU. The company will also have to link those parts together in an optimal way—this is not an easy thing to do, and it has a huge impact on overall performance. Sandy Bridge's graphics performance is a testament to what successful die-level integration can do; the Sandy Bridge GPU itself is no great shakes, but the way that Intel has clocked it and linked it to the rest of the die makes all the difference.
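
As a rough software analogy for what on-die integration removes, today's CUDA already offers "zero-copy" mapped host memory, where the GPU dereferences host RAM directly instead of staging an explicit copy. A minimal sketch (names and sizes are illustrative):

```cuda
// Illustrative only: mapped ("zero-copy") host memory in CUDA.
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float k) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= k;
}

int main() {
    const int n = 4096;
    cudaSetDeviceFlags(cudaDeviceMapHost);           // allow mapped allocations

    float *host = 0, *dev = 0;
    cudaHostAlloc((void**)&host, n * sizeof(float), cudaHostAllocMapped);
    for (int i = 0; i < n; ++i) host[i] = float(i);  // CPU writes the buffer

    // GPU-visible alias of the same buffer; no cudaMemcpy anywhere.
    cudaHostGetDevicePointer((void**)&dev, host, 0);
    scale<<<(n + 255) / 256, 256>>>(dev, n, 0.5f);
    cudaDeviceSynchronize();                         // CPU now sees the results

    cudaFreeHost(host);
    return 0;
}
```

On a discrete card, every one of those GPU accesses still crosses the PCIe bus, which is why zero-copy is a niche tool today. Put the CPU cores and the GPU behind the same memory controller and the same pattern becomes cheap; that is the kind of payoff good die-level integration is about.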

If NVIDIA can execute in all three areas—CPU design, GPU design, and SoC system design—then it could potentially make one killer gaming and supercomputing CPU. But this is a very tall order, and a lot of things could go wrong here. Right now, the GPU execution part is the only one where confidence is warranted based on a track record. With the system integration stuff and CPU part, NVIDIA is in uncharted territory. (The Tegra SoC part of NVIDIA's record isn't as relevant as you might think, because Denver is a different kettle of fish entirely.)

We'll keep you posted as more details unfold. I'm currently trying to line up a deep dive briefing on Denver's core, so stay tuned.

Pretty interesting article. Comments?
 

Hard Ball

Senior member
Jul 3, 2005
594
0
0
Depends on your definition of success...

Exactly.

The poll is a very good idea, but I think we would be able to make more concrete predictions if "success" were defined per category of device. In a few years, you will probably see a different degree of success in each of the following market segments that pertain to today's x86 chips:

  • tablets/convertibles
  • netbook
  • CULV/consumer ultraportables
  • nettop/SFF
  • full-size/multimedia laptops
  • consumer desktop
  • business/enterprise laptops
  • business workstations
  • professional workstation/gaming desktop
  • storage and database servers
  • web servers
  • high density HPC clusters
  • distributed computing
  • ... ...
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I just found this article (unfortunately it is one week old).

Any comments on how Nvidia is implementing the ARM cores and how this would help the company's HPC GPGPU efforts?

http://www.xbitlabs.com/news/cpu/di...ave_Integrated_ARM_General_Purpose_Cores.html

Nvidia Maxwell Graphics Processors to Have Integrated ARM General-Purpose Cores.

Nvidia GeForce "Maxwell" Will Feature General-Purpose Cores - Company
[01/19/2011 08:46 PM]
by Anton Shilov

Nvidia Corp. will integrate general-purpose ARM processing core(s) into a chip that belongs to the Maxwell family of graphics processing units (GPUs), the company revealed in an interview. The Maxwell-generation chip will be the first commercial physical implementation of Nvidia's Project Denver and will also be the company's first accelerated processing unit (APU).

"The Maxwell generation will be the first end-product using Project Denver. This is a far greater resource investment for us than just licensing a design," said Mike Rayfield, general manager of mobile solutions for Nvidia, in an interview with Hexus web-site.

Nvidia's initiative code-named Denver describes an Nvidia CPU running the ARM instruction set, which will be fully integrated on the same chip as the Nvidia GPU.

Nvidia Maxwell will be launched in 2013, as was revealed at Nvidia's GPU Technology Conference in September 2010. Given the timeframe, it is logical to expect 20nm process technology to be used for manufacturing of Maxwell. The architecture, due almost three years from now, will offer a whopping 14 - 16 GFLOPS of double-precision performance per watt, a massive improvement over current-generation hardware.

"Between now and Maxwell, we will introduce virtual memory, pre-emption, enhance the ability of GPU to autonomously process, so that it's non-blocking of the CPU, not waiting for the CPU, relies less on the transfer overheads that we see today. These will take GPU computing to the next level, along with a very large speed up in performance," said Jen-Hsun Huang, chief executive of Nvidia, at GTC 2010.

This is the first time Nvidia has publicly revealed timeframes for Project Denver. Unfortunately, not all the details are clear at this point, and it is unknown whether all members of the Maxwell family will have integrated general-purpose ARM cores. Such cores will bring the most benefit to compute applications, and therefore Nvidia may omit ARM from low-cost designs.
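
Reading Huang's quote above, the "transfer overheads" and "non-blocking of the CPU" parts map onto a pattern CUDA programmers already know: pinned host memory plus streams, where copies and kernels are queued asynchronously so the CPU doesn't stall on each transfer. A minimal sketch of that pattern (my own illustration; sizes and names are made up):

```cuda
// Illustrative only: async copies + a stream keep the CPU from blocking.
#include <cuda_runtime.h>

__global__ void inc(float *d, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) d[i] += 1.0f;
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *h, *d;
    cudaHostAlloc((void**)&h, bytes, cudaHostAllocDefault); // pinned => truly async copies
    cudaMalloc((void**)&d, bytes);
    for (int i = 0; i < n; ++i) h[i] = 0.0f;

    cudaStream_t s;
    cudaStreamCreate(&s);

    // All three calls just enqueue work and return immediately.
    cudaMemcpyAsync(d, h, bytes, cudaMemcpyHostToDevice, s);
    inc<<<(n + 255) / 256, 256, 0, s>>>(d, n);
    cudaMemcpyAsync(h, d, bytes, cudaMemcpyDeviceToHost, s);

    // ... the CPU is free to do unrelated work here ...

    cudaStreamSynchronize(s);   // the only point where the CPU actually waits
    cudaStreamDestroy(s);
    cudaFree(d);
    cudaFreeHost(h);
    return 0;
}
```

Everything queued on the stream runs asynchronously, and the CPU only blocks at the final synchronize. Huang's list (virtual memory, pre-emption, autonomous processing) reads like an attempt to push past even this model, so the GPU can make progress on its own instead of having the host orchestrate every copy and kernel.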
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
One thing that can be said with certainty: if Nvidia fails to succeed, it won't be for lack of trying.

Be it Intel, AMD, or Nvidia...they are all attempting to play the same game, each coming to the field with a differing mix of team strengths and weaknesses in their respective CPU and GPU IP portfolios.

A decided advantage for Intel is their access to leading-edge process technology. A decided disadvantage for Intel is that their gross-margin directive prevents them from putting that leading-edge process tech into all their products (look at the lag in Atom and Itanium) on a release timeline that is competitive.

A decided advantage for AMD is their x86 license. A decided disadvantage for AMD is their debt load and cash reserves (limits their ability to invest in themselves now for an even bigger tomorrow).

A decided advantage of Nvidia is their cash reserves and their extremely driven CEO/Founder. A decided disadvantage for Nvidia is their lack of access to the x86 market space (that is a lot of TAM that they can't access to fund their R&D and growth aspirations).

No one player has it all, and every player's fate lies in the hands of the same two groups of people...the software developers and the consumers.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
One thing that can be said with certainty: if Nvidia fails to succeed, it won't be for lack of trying.

Be it Intel, AMD, or Nvidia...they are all attempting to play the same game, each coming to the field with a differing mix of team strengths and weaknesses in their respective CPU and GPU IP portfolios.

A decided advantage for Intel is their access to leading-edge process technology. A decided disadvantage for Intel is that their gross-margin directive prevents them from putting that leading-edge process tech into all their products (look at the lag in Atom and Itanium) on a release timeline that is competitive.

A decided advantage for AMD is their x86 license. A decided disadvantage for AMD is their debt load and cash reserves (limits their ability to invest in themselves now for an even bigger tomorrow).

A decided advantage of Nvidia is their cash reserves and their extremely driven CEO/Founder. A decided disadvantage for Nvidia is their lack of access to the x86 market space (that is a lot of TAM that they can't access to fund their R&D and growth aspirations).

No one player has it all, and every player's fate lies in the hands of the same two groups of people...the software developers and the consumers.

With respect to x86, how much advantage is that really IF MS goes Cloud for Windows 8 OS?

Will MS still sell Windows 7 alongside Windows 8, or will we see a shift over to some form of Android desktop for customers who don't want Cloud? (Recently one of the core contributors to MS Kinect jumped ship and started working for Google. More information here. Not sure what this really means, but I suspect it may be a sign Google has plans for some type of Console/Desktop OS.)
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Well, NVIDIA has a lot of wealthy customers right now that they can sell this chip to: Sony (PS3), Google (Android), Microsoft (Zune). So I think it will at least be profitable for them.

The way Apple is moving, it would not shock me for them to move to this chip. Ditch OS X for iOS on the desktop.

Again, I suppose it depends on your measure of success. If I recall, AMD is at around 20% of the desktop market. Is that a success? If so, I think NVIDIA could achieve at least that in the Tablet/Nettop/Ultraportable/TV/etc. space.

I wonder how many people who voted no, just don't want it to succeed.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
With respect to x86, how much advantage is that really IF MS goes Cloud for Windows 8 OS?

Will MS still sell Windows 7 alongside Windows 8, or will we see a shift over to some form of Android desktop for customers who don't want Cloud? (Recently one of the core contributors to MS Kinect jumped ship and started working for Google. More information here. Not sure what this really means, but I suspect it may be a sign Google has plans for some type of Console/Desktop OS.)

The advantage is in the economics that underlie how business decisions are made. The x86 marketspace is well-defined; it has a dollar value assigned to it.

As an exec, telling your BoD that you are planning to spend 25% of your R&D budget developing a product that is projected to snag 5% of a well-defined x86 marketspace is something they can run numbers on and generate ROI expectations from with confidence.
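
To put rough numbers on it (using the ~$50B/yr CPU revenue figure I get to below): 5% of a $50B/yr marketspace is about $2.5B/yr of projected revenue, something a board can actually plug into an ROI model. 5% of a market with no defined value is 5% of an unknown.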

Telling your BoD you are doing the same but to develop a product for a currently non-existent (or fledgling) market with no tangible market value to speak of and no confidence in the projected adoption rates and software-development uptake is a whole other ball of wax. You make the pitch, but your budget request is not 25%; it's more like 5% or some such.

This is the disadvantage that I am speaking to. Multi-billion-dollar corporations don't operate on hope and fear; they operate on economics and accounting. Until someone can project a market value that is believable (and not have it turn into this) there won't be much justification to make serious investments into developing products to intersect an imaginary timeline relating to an imaginary market segment that exists only in slideware today.

That isn't to say they won't try, just that it works against them in their efforts to try. It's an intrinsic disadvantage to compete against the home team. But the potential rewards are high, because the home team will be a dinosaur when it comes to pursuing new ideas (look at MS vs Google).
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I don't think the first "Denver" CPU will succeed - early iterations of some wild hardware shift do nothing but lay the groundwork for the future - however, I think it's pretty likely that it'll morph into something very successful in the end.

IDontCare - you say that x86 is an advantage. I'd actually say not having it is an advantage for Nvidia now. Intel and AMD are busy supporting legacy x86 - they aren't able to attack future markets because of it. Look at how they are missing out in the phone/tablet market because they refuse to do anything that's not x86. Nvidia have a much clearer focus - they have no legacy x86 market to lose (even their x86 chipset market is dead) and everything to gain.

If there is a major paradigm shift coming, Nvidia are the ones ready to embrace it - plenty of money in the bank, good software and hardware teams, strong links with TSMC to build everything, focused driven leadership, and little legacy to hold them back.
 

OBLAMA2009

Diamond Member
Apr 17, 2008
6,574
3
0
What reason would anyone who uses a PC today have to switch to that, a slower chip? It also has no OS, no applications, and no other users. If people were going to use Linux they would have already done so on Intel. I predict a huge desktop failure followed by a semi-successful rebirth as a cell phone (and to a lesser degree tablet) chip. "It isn't Intel" just isn't a sufficient selling point for most people.
 
Last edited:

zebrax2

Senior member
Nov 18, 2007
972
62
91
IDontCare - you say that x86 is an advantage. I'd actually say not having it is an advantage for Nvidia now. Intel and AMD are busy supporting legacy x86 - they aren't able to attack future markets because of it. Look at how they are missing out in the phone/tablet market because they refuse to do anything that's not x86. Nvidia have a much clearer focus - they have no legacy x86 market to lose (even their x86 chipset market is dead) and everything to gain.

The advantage would be that they have a market of their own. While ARM threatens some of the market they have, that doesn't mean a shift will happen overnight. Also, no one is stopping these two companies from buying an ARM license for themselves (correct me if I'm wrong); on the other hand, Nvidia can't get an x86 license.

This is probably also the reason why Intel and AMD won't abandon x86. They would rather defend the market they already own than join one already filled with other competitors, since holding their own market means having it largely to themselves, much like it is now for their current products.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I don't think the first "Denver" CPU will succeed - early iterations of some wild hardware shift do nothing but lay the groundwork for the future - however, I think it's pretty likely that it'll morph into something very successful in the end.

IDontCare - you say that x86 is an advantage. I'd actually say not having it is an advantage for Nvidia now. Intel and AMD are busy supporting legacy x86 - they aren't able to attack future markets because of it. Look at how they are missing out in the phone/tablet market because they refuse to do anything that's not x86. Nvidia have a much clearer focus - they have no legacy x86 market to lose (even their x86 chipset market is dead) and everything to gain.

If there is a major paradigm shift coming, Nvidia are the ones ready to embrace it - plenty of money in the bank, good software and hardware teams, strong links with TSMC to build everything, focused driven leadership, and little legacy to hold them back.

I'm not disagreeing with much of this, but the question is how relevant this aspect is. Is it so overwhelmingly relevant and unavoidable that having existing products selling into an existing marketspace suddenly becomes a liability?

This is your position, I understand that, but I don't agree with the assessment that it is to Intel's and AMD's disadvantage that they operate in a marketspace that generates some $50B/yr in CPU revenue.

I also think the tablet market is a non-starter for the same reasons the netbook has mostly been a non-starter. People don't mind the inconvenience that a full-powered laptop brings, and at the other end of the spectrum are the apps that people would just as soon do on their smartphone anyways.

(in a sense I am saying that I believe the only form-factor for tablets that will see success 5yrs from now are the phone-sized form factors)

So in my mind I see things shaking out over the next 10yrs as x86 vs. ARM in the smartphone market...and the only way ARM will be competing with x86 in the laptop/desktop/etc. marketspace is if they somehow increase the potency of the architecture by a couple of orders of magnitude above and beyond whatever improvements Intel and AMD bring to x86 in the meantime. Just my opinion, nothing more.
 

PreferLinux

Senior member
Dec 29, 2010
420
0
0
What reason would anyone who uses a PC today have to switch to that, a slower chip? It also has no OS, no applications, and no other users. If people were going to use Linux they would have already done so on Intel. I predict a huge desktop failure followed by a semi-successful rebirth as a cell phone (and to a lesser degree tablet) chip. "It isn't Intel" just isn't a sufficient selling point for most people.
Windows 8 will supposedly run on ARM, and that would include NVIDIA. "It is Intel" is a big selling point for some.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
This is the disadvantage that I am speaking to. Multi-billion-dollar corporations don't operate on hope and fear; they operate on economics and accounting. Until someone can project a market value that is believable (and not have it turn into this) there won't be much justification to make serious investments into developing products to intersect an imaginary timeline relating to an imaginary market segment that exists only in slideware today.

Itanium, I think, was executed poorly.
Intel produced an architecture superior to x86...
but they:
1. didn't commit enough resources to it, resulting in it being several generations behind on process tech.
2. didn't even allow home users to buy it, trying to keep it as a server-only product for better margins. They didn't want to cannibalize their existing x86 market, so they strictly and artificially limited Itanium's penetration, with obvious results.

A lot of companies like to pretend that our communications and graphics technologies are not driven by pornography and video games. And they always fail when they do that.
 

OBLAMA2009

Diamond Member
Apr 17, 2008
6,574
3
0
I think a whole new architecture by Intel would have a better chance of succeeding than ARM.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
Hard to answer as the definition of success is subjective.

Will it compete and put up any sort of credible performance against Intel's best or even moderate offerings? Not a chance.

Will it fail and be relegated to being a novelty like their PhysX technology? No, it will find itself a niche.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I think the first place we will see it is in netbooks...
suddenly Atom will find itself facing processors that are as cheap or cheaper, smaller, and draw a fraction of its power... HTPCs will also benefit.
Laptops and extreme-budget desktops come later, but I foresee slow penetration in those markets unless we see truly massive performance gains combined with very low prices compared to current tech. Or just a massive performance increase combined with momentum from a potential dominance of the netbook/laptop market. But that is fairly unlikely; there is massive gaming momentum behind current tech.

However, Intel is one of those companies that fail to recognize the porn/video games rule (along with MS, for that matter).
If Nvidia is smart they will work very, very VERY hard to make sure the next-gen PlayStation and Xbox use an ARM-based CPU (although it looks like that boat has probably sailed already). If they do that, then all those console-developed games will be super easy to port to Windows-on-ARM while being incompatible, without a significant rewrite, on Windows x86.
That would be a deathblow to x86... but I don't see it happening.
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I'm not disagreeing with much of this, but the question is how relevant this aspect is. Is it so overwhelmingly relevant and unavoidable that having existing products selling into an existing marketspace suddenly becomes a liability?

This is your position, I understand that, but I don't agree with the assessment that it is to Intel's and AMD's disadvantage that they operate in a marketspace that generates some $50B/yr in CPU revenue.

I also think the tablet market is a non-starter for the same reasons the netbook has mostly been a non-starter. People don't mind the inconvenience that a full-powered laptop brings, and at the other end of the spectrum are the apps that people would just as soon do on their smartphone anyways.

(in a sense I am saying that I believe the only form-factor for tablets that will see success 5yrs from now are the phone-sized form factors)

So in my mind I see things shaking out over the next 10yrs as x86 vs. ARM in the smartphone market...and the only way ARM will be competing with x86 in the laptop/desktop/etc. marketspace is if they somehow increase the potency of the architecture by a couple of orders of magnitude above and beyond whatever improvements Intel and AMD bring to x86 in the meantime. Just my opinion, nothing more.

I agree that for Intel x86 makes a lot of sense - they make huge profits. Even if they miss the boat, they are so large and rich they can spend their way out of trouble. For AMD the case isn't so sure - they don't make big profits from x86, so in a shrinking market they could be in trouble, and if they get into trouble they don't have the money in the bank to get out of it.

I think you are wrong about tablets; I think in 5 years we'll all have one. I think the iPad is a pretty flawed machine, but everyone I know who has one uses it all the time - it doesn't seem to be one of those gadgets that everyone buys, plays with for a week, then leaves to sit gathering dust for the next year.

As for ARM needing to get two orders of magnitude more powerful: I'm not convinced. The counter-arguments are:
1) The new SoC ARM architectures often do the same tasks that a traditional x86 CPU would, using two orders of magnitude less power, by relying on specialised hardware.
2) The cloud is now there to help out - e.g. it can turn compute-heavy Word docs into something that a less powerful machine can view.
3) How much power does the average user need? Do they really need all the additional compute power that a high-powered x86 machine will give them in 5 years - to do what?
 