Larrabee

lopri

Elite Member
Jul 27, 2002
13,218
600
126
I was surprised to find the first live demo of Larrabee, then surprised again at how little attention it garnered. If Intel is launching Larrabee next year, is it still 'too early to tell'?

Anyhow, it looks like the demo @IDF didn't impress the press very much. Here we see Anand tag it as a proof of concept, and here we see Bit-Tech being more up-front despite Intel's explanation.

Finally, I'll quote this news item from TechReport, which links a video of the event. The video shows Larrabee doing real-time ray tracing in a Gulftown (6-core) system.

I am not sure what to make of this demo. On one hand, the demo does indeed look awful (Jell-O water?), but how much weight should be put on the fact that it is done via ray tracing? And what are the odds of Intel holding something better close to its chest?

Towards the end of the video, the guy mentions something called "Task Parallelism(?)" and flying objects in the same sentence, and something about GPU cores functioning as CPU cores. What came to my mind at that moment was "IceStorm Fighters", which to date is the only 'game' I know of that uses multi-core CPUs in a truly parallel fashion.

Discuss?
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Originally posted by: lopri
Towards the end of the video, the guy mentions something called "Task Parallelism(?)" and flying objects in the same sentence, and something about GPU cores functioning as CPU cores. What came to my mind at that moment was "IceStorm Fighters", which to date is the only 'game' I know of that uses multi-core CPUs in a truly parallel fashion.

I asked about this in a different thread and got no response at all. Since the Larrabee cores are fully x86, could they be used as an additional "CPU" to provide extra processing muscle when not being used for heavy graphics duty?

It would almost be like having a dual-processor setup; I just wonder how much power the additional cores would bring to the table.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Perhaps because it is running an older game?

btw I found the physical size of the i7 on 1000nm interesting. I actually expected it to be a lot bigger.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Genx87
btw I found the physical size of the i7 on 1000nm interesting. I actually expected it to be a lot bigger.

It would have been, had they actually required the design/layout to result in an electrically functioning product. What they did there was simply assume a linear dummy expansion of equivalent proportions, scaling up the dimensions of the existing 45nm chip by ~20x.

It provides a conservative lower estimate of what such a product would have been sized at; there's no harm in underestimating with your marketing gimmicks, so no one has any reason to cry foul over the technicalities.
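A rough back-of-the-envelope check on what that scale-up implies (the 45nm die area below is my own assumed figure for illustration, not something Intel quoted):

```python
# Back-of-the-envelope: what scaling a 45nm chip up to a hypothetical 1000nm
# process means for die size. The 45nm die area is an assumed, illustrative value.
feature_old_nm = 45.0
feature_new_nm = 1000.0

linear_scale = feature_new_nm / feature_old_nm   # ~22x in each dimension (their "~20x")
area_scale = linear_scale ** 2                   # area grows with the square, ~490x

assumed_i7_die_mm2 = 263.0                       # assumption: Bloomfield-class die area
dummy_die_cm2 = assumed_i7_die_mm2 * area_scale / 100.0

print(f"linear scale: {linear_scale:.1f}x, area scale: {area_scale:.0f}x")
print(f"dummy 1000nm 'die' area: ~{dummy_die_cm2:.0f} cm^2")
```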

Originally posted by: Denithor
Originally posted by: lopri
Towards the end of the video, the guy mentions something called "Task Parallelism(?)" and flying objects in the same sentence, and something about GPU cores functioning as CPU cores. What came to my mind at that moment was "IceStorm Fighters", which to date is the only 'game' I know of that uses multi-core CPUs in a truly parallel fashion.

I asked about this in a different thread and got no response at all. Since the Larrabee cores are fully x86, could they be used as an additional "CPU" to provide extra processing muscle when not being used for heavy graphics duty?

It would almost be like having a dual-processor setup; I just wonder how much power the additional cores would bring to the table.

I must have missed it then, as I would have provided you with a few links attesting to the very high probability that the Larrabee ISA and architecture will be leveraged in future compilers and programs.

Larrabee GPU/CPU

Larrabee versus Pentium

Larrabee schematic

There are many possibilities for Larrabee's evolution

TONS of embedded Intel slides on what Larrabee is

100-core CPU is the next target period

Speculation on integration timeline for LRBi and x86 ISA
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,174
126
I was looking forward to Larrabee but have been fairly unimpressed so far...especially if they're going to launch early next year (which I think is highly unlikely now).
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I read earlier that Larrabee would offer "GTX 285-type" performance. That's pretty good today, but in a year it will be midrange at best. Plus, since all current games are designed for NVIDIA/ATI, how many titles will even run at a reasonable level on Larrabee for a while?
 

Udgnim

Diamond Member
Apr 16, 2008
3,665
112
106
I think most people assume that Larrabee will be behind in performance in comparison to whatever ATI or Nvidia has to offer.

so until there are some real benchmarks, I doubt people will be hyped for Larrabee
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
That "real-time" RT demo was running a very outdated ET:QW engine with not much pixel to push - and even that was close to the single-digit framerate ghetto...


...from any practical PoV Larrabee's RTing is pretty much a failure as it is now.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Originally posted by: Udgnim
I think most people assume that Larrabee will be behind in performance in comparison to whatever ATI or Nvidia has to offer.

so until there are some real benchmarks, I doubt people will be hyped for Larrabee

So far what we have seen doesn't really come across as a viable solution - it really, really reminds me of the famous i740 fiasco.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Intel should fire the PR guy who made the call for that video conference; that video of Larrabee in action was absolutely hideous and showed numerous shortcomings of ray tracing.

Pay close attention to the reflections on the water: occasionally the reflection of the ship has holes through it as the angle of the waves changes. This can be handled by ray tracing with enough bounces (though not as well as by other rendering methods), but clearly they had massive performance issues pushing a very simplistic scene as it was.

The lack of diffuse lighting makes the entire game look too flat, and far older than it actually is.

I really don't get the point of that video clip. Where ray tracing is at its strongest, they failed to run it with settings that would allow it to shine, making it look inferior to reflection-handling technologies that have been available for years; they didn't use a scene that takes advantage of shadowing; and they showed a scene the system was struggling to render, resulting in horrible framerates. A bad call all around by Intel. If that's the best they can do, they are going to get killed by integrated solutions when they launch.

Since the Larrabee cores are fully x86, could they be used as an additional "CPU" to provide extra processing muscle when not being used for heavy graphics duty?

With proper support, yes, but it won't take code the i7 can crank through and run it well. The cores on Larrabee are in-order and rely heavily on vectorization for strong performance. This is actually where Intel seems to be focusing almost all of its effort: Larrabee is a counter to CUDA, not a serious graphics solution (although I'm sure they would love to dominate that market, they don't seem willing to compromise their GPGPU functionality to increase normal GPU performance).
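A loose illustration of that point, in Python/NumPy rather than anything Larrabee-specific (real Larrabee code would presumably be C with vector intrinsics; the function names here are just made up for the example):

```python
# Loose analogy only: the same arithmetic expressed as a scalar, element-at-a-time
# loop versus a single wide array operation. In-order cores like Larrabee's depend
# on the latter style (vectorized, regular work) for throughput.
import numpy as np

def saxpy_scalar(a, x, y):
    # One element per iteration; dependent, branchy work that an out-of-order
    # core can hide latency behind, but a simple in-order core cannot.
    out = [0.0] * len(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vectorized(a, x, y):
    # Whole-array operation; this is the shape of code wide vector units want.
    return a * x + y

x = np.arange(1_000_000, dtype=np.float32)
y = np.ones_like(x)
result = saxpy_vectorized(2.0, x, y)
```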
 

lopri

Elite Member
Jul 27, 2002
13,218
600
126
@Idontcare: Those diagrams make my head spin. Thank you for the information but I must confess that I don't understand half of the stuff. (generously speaking)

On a more pedestrian note: does anyone have an idea at what resolution the demo was run? And assuming, for a moment, that ray tracing were as ubiquitous as rasterization is today, how would AMD/NV's offerings fare at that ray tracing demo?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: BenSkywalker
This is actually where Intel seems to be focusing almost all of its effort: Larrabee is a counter to CUDA, not a serious graphics solution (although I'm sure they would love to dominate that market, they don't seem willing to compromise their GPGPU functionality to increase normal GPU performance).

That actually makes a lot of sense if we contemplate the possibility that the motivation for Intel wanting to jump into the discrete GPU business was actually a preemptive move to defend against GPGPU invading their computing dominion (the best-defense-is-a-good-offense philosophy).

If this was the motivation, versus, say, a desire to diversify into alternative revenue sources such as the existing GPU marketspace, then it would make sense that their priority is really to make sure the product can service the GPGPU industry; if it can be sold as a discrete GPU, then that is just extra.

It does ever so remind me of the hubris Intel projected with their HDTV announcements:

Intel shows off giant screens

Intel delays first TV chip

Intel kills TV chip plans

In the case of HDTV they pulled out once the gross margin story came out (GM's suck, as you can imagine).

We know the gross margin story of the discrete GPU business... they suck too; when were annual profits last reported for NV's or AMD's GPU division?

So we can already ask the rational question - what does Intel think the gross margins are going to be like for Larrabee?

Originally posted by: lopri
@Idontcare: Those diagrams make my head spin. Thank you for the information but I must confess that I don't understand half of the stuff. (generously speaking)

Basically, it just says that Fusion-type products, in which there is an actual synergy (benefit) to having an accessible ISA on the GPU, are plausible for the embarrassingly parallel workloads out there... exactly what CUDA is going after.

Do you remember those PCI co-processor accelerators that were made for Photoshop back in the mid-nineties? The idea has existed for ages, but with Larrabee the software support (compiler) is nearly already there because Larrabee is already x86, so applications could take a dramatic jump. (Just speculating, I don't actually know of anything that is going to happen.)
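A toy sketch of what "embarrassingly parallel" means here (nothing Larrabee- or CUDA-specific; the per-pixel operation is just an assumed stand-in):

```python
# Toy example of an embarrassingly parallel workload: each element is processed
# independently, so the work splits cleanly across however many cores are available.
from multiprocessing import Pool

def brighten(pixel):
    # Stand-in per-pixel operation; no shared state, no ordering requirements.
    return min(255, int(pixel * 1.2))

if __name__ == "__main__":
    pixels = list(range(256)) * 1000
    with Pool() as pool:                       # one worker per CPU core by default
        result = pool.map(brighten, pixels, chunksize=4096)
    print(len(result), "pixels processed")
```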
 

lopri

Elite Member
Jul 27, 2002
13,218
600
126
How about this interview with Carmack, in retrospect? It's from about a year and a half ago:

From the developers stand point there are pros and cons to that. We could certainly do interesting things with either direction. But literally just last week I was doing a little bit of research work on these things. The direction that everybody is looking at for next generation, both console and eventual graphics card stuff, is a "sea of processors" model, typified by Larrabee or enhanced CUDA and things like that, and everybody is sort of waving their hands and talking about "oh we'll do wonderful things with all this" but there is very little in the way of real proof-of-concept work going on. There's no one showing the demo of like, here this is what games are going to look like on the next generation when we have 10x more processing power - nothing compelling has actually been demonstrated and everyone is busy making these multi-billion dollar decisions about what things are going to be like 5 years from now in the gaming world. I have a direction in mind with this but until everybody can actually make movies of what this is going to be like at subscale speeds, it's distressing to me that there is so much effort going on without anybody showing exactly what the prize is that all of this is going to give us.

http://www.pcper.com/article.php?aid=532
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
The demo is kinda disappointing, but I guess the thing to take home is that the silicon works at least. Either they are having trouble or the silicon barely works (or even both).
 

Zap

Elite Member
Oct 13, 1999
22,377
2
81
Wow, I remember when Larrabee info just leaked out and everyone was predicting the deaths of Nvidia/ATI.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
I never underestimate Intel, but I really expected a lot more from their first showing. I don't understand their love for ray tracing, other than that they figure they can create a processor to handle ray tracing better than they can design a dedicated graphics processor. Besides, how many years has Intel been in the graphics business without producing a good integrated graphics processor?
 

Zap

Elite Member
Oct 13, 1999
22,377
2
81
Originally posted by: AmdInside
how many years has Intel been in the graphics business without producing a good integrated graphics processor?

Intel does have a bigger piece of the graphics market than any other company, so I don't think they're too worried about being the "best" when they can be the "best seller."
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
The demo is kinda disappointing.

But Intel is saying that the current Larrabee is early in development and can give only 10% of the supposed final product's performance (according to Intel...).

If this is true, then taking that into account, it is not so bad I guess.
On the other hand, 1/10 of the performance is like the difference between a 4550 and a 1GHz 4890.

What is the resolution of the demo?

I guess a 4550 can do classic rasterised QW with the highest quality settings and 4x AA / 16x AF at around 25-30 fps (1280x720).
Anyway, it was a bad decision to use QW's brown (lol) art assets.

I have seen some ray tracing demos and they look superb (but those were not real-time demos, so it doesn't count).

-------------------------------------------------------------

ATI CTO on Intel Larrabee

http://www.youtube.com/watch?v=ODwWjO-x9Is

Nvision: Nvidia CEO on Intel Larrabee

http://www.youtube.com/watch?v...GLyYU4&feature=related
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: lopri
How about this interview with Carmack, in retrospect? It's from about a year and a half ago:

From the developers stand point there are pros and cons to that. We could certainly do interesting things with either direction. But literally just last week I was doing a little bit of research work on these things. The direction that everybody is looking at for next generation, both console and eventual graphics card stuff, is a "sea of processors" model, typified by Larrabee or enhanced CUDA and things like that, and everybody is sort of waving their hands and talking about "oh we'll do wonderful things with all this" but there is very little in the way of real proof-of-concept work going on. There's no one showing the demo of like, here this is what games are going to look like on the next generation when we have 10x more processing power - nothing compelling has actually been demonstrated and everyone is busy making these multi-billion dollar decisions about what things are going to be like 5 years from now in the gaming world. I have a direction in mind with this but until everybody can actually make movies of what this is going to be like at subscale speeds, it's distressing to me that there is so much effort going on without anybody showing exactly what the prize is that all of this is going to give us.

http://www.pcper.com/article.php?aid=532

I thought there was that Project Offset or some such for Larrabee floating around out there.

You know we are just begging for Nemesis to come in here and start tossing up links like a Bing search on crack, right?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: BenSkywalker
Intel should fire the PR guy who made the call for that video conference; that video of Larrabee in action was absolutely hideous and showed numerous shortcomings of ray tracing.

Pay close attention to the reflections on the water: occasionally the reflection of the ship has holes through it as the angle of the waves changes. This can be handled by ray tracing with enough bounces (though not as well as by other rendering methods), but clearly they had massive performance issues pushing a very simplistic scene as it was.

The lack of diffuse lighting makes the entire game look too flat, and far older than it actually is.

I really don't get the point of that video clip. Where ray tracing is at its strongest, they failed to run it with settings that would allow it to shine, making it look inferior to reflection-handling technologies that have been available for years; they didn't use a scene that takes advantage of shadowing; and they showed a scene the system was struggling to render, resulting in horrible framerates. A bad call all around by Intel. If that's the best they can do, they are going to get killed by integrated solutions when they launch.

Since the Larrabee cores are fully x86, could they be used as an additional "CPU" to provide extra processing muscle when not being used for heavy graphics duty?

With proper support, yes, but it won't take code the i7 can crank through and run it well. The cores on Larrabee are in-order and rely heavily on vectorization for strong performance. This is actually where Intel seems to be focusing almost all of its effort: Larrabee is a counter to CUDA, not a serious graphics solution (although I'm sure they would love to dominate that market, they don't seem willing to compromise their GPGPU functionality to increase normal GPU performance).

So do you feel that it's sort of a bait and switch? That Intel may be trying very hard to mislead GPGPU makers into thinking they are going all out to be competitive in the graphics market, when all along, ever since the idea of Larrabee was fostered, the goal was to get a multicore CPU to perform, or outperform, massively parallel tasks such as CUDA or Stream... and also run graphics as satisfactorily as today's IGPs?

A lot to swallow, but anything is possible.
 
May 11, 2008
20,260
1,150
126
Originally posted by: Idontcare
Originally posted by: BenSkywalker
This is actually where Intel seems to be focusing almost all of its effort: Larrabee is a counter to CUDA, not a serious graphics solution (although I'm sure they would love to dominate that market, they don't seem willing to compromise their GPGPU functionality to increase normal GPU performance).

That actually makes a lot of sense if we contemplate the possibility that the motivation for Intel wanting to jump into the discrete GPU business was actually a preemptive move to defend against GPGPU invading their computing dominion (the best-defense-is-a-good-offense philosophy).

If this was the motivation, versus, say, a desire to diversify into alternative revenue sources such as the existing GPU marketspace, then it would make sense that their priority is really to make sure the product can service the GPGPU industry; if it can be sold as a discrete GPU, then that is just extra.

It does ever so remind me of the hubris Intel projected with their HDTV announcements:

Intel shows off giant screens

Intel delays first TV chip

Intel kills TV chip plans

In the case of HDTV they pulled out once the gross margin story came out (GM's suck, as you can imagine).

We know the gross margin story of the discrete GPU business... they suck too; when were annual profits last reported for NV's or AMD's GPU division?

So we can already ask the rational question - what does Intel think the gross margins are going to be like for Larrabee?

Originally posted by: lopri
@Idontcare: Those diagrams make my head spin. Thank you for the information but I must confess that I don't understand half of the stuff. (generously speaking)

Basically, it just says that Fusion-type products, in which there is an actual synergy (benefit) to having an accessible ISA on the GPU, are plausible for the embarrassingly parallel workloads out there... exactly what CUDA is going after.

Do you remember those PCI co-processor accelerators that were made for Photoshop back in the mid-nineties? The idea has existed for ages, but with Larrabee the software support (compiler) is nearly already there because Larrabee is already x86, so applications could take a dramatic jump. (Just speculating, I don't actually know of anything that is going to happen.)

I think you are spot on.


Medical imaging using NVIDIA graphics cards and CUDA has sped up data processing from several hours down to 30-odd minutes. I wish I had more links, but I am unable to find what I read.

gpgpu medical

Now this is a huge moneymaking field.

Let's look at movie making:
People love movies with computer-generated effects (I know I do), even when those effects are not always obvious. With the use of GPGPU, fewer and fewer dangerous scenes have to be shot, because you can add in all the effects later.


I still think Intel is looking further ahead into the future, plotting different scenarios.
Ray tracing is coming. I think ray tracing can be used for more than just lighting models.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
On a more pedestrian note: does anyone have an idea at what resolution the demo was run? And assuming, for a moment, that ray tracing were as ubiquitous as rasterization is today, how would AMD/NV's offerings fare at that ray tracing demo?

This is a bit of a loaded question. If nV/AMD ran that exact same demo right now, their boards would suck running it, much as Larrabee does. The big difference is that Larrabee is built with nigh no compromises to run exactly that, while AMD/nV would just happen to be able to run it because of what their hardware is capable of; a huge portion of their die space would simply be idle while doing it. There are many, many reasons why ray tracing isn't a good choice for real-time rendering - nVidia and ATi both know this quite well (dynamic geometry and diffuse being huge stumbling blocks for ray tracing, both key elements of games). Intel made a gamble that rasterizers would stop scaling, at which point ray tracing would start to catch up (you could have a million cores working on a ray traced scene with perfect scaling as long as the support systems were in place). The problem for Intel is that rasterization is scaling as fast as it ever has, so the rift is still growing at a significant pace; it will be a very long time before ray tracing comes close to being even with rasterization for real time (likely more than a decade at least - and that is just talking speed; in terms of feature parity it will take much longer).
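The "perfect scaling" point is easy to picture: every pixel's primary ray is independent, so the frame splits cleanly across cores. A toy sketch of the idea (not how Larrabee's renderer actually works; trace_pixel is just a placeholder):

```python
# Toy illustration of why ray tracing parallelizes so well: each pixel's ray is
# independent, so rows of the image can be handed to separate cores with no
# shared state between them.
from multiprocessing import Pool

WIDTH, HEIGHT = 640, 360

def trace_pixel(x, y):
    # Placeholder for a real ray/scene intersection and shading computation.
    return (x * y) % 256

def render_row(y):
    return [trace_pixel(x, y) for x in range(WIDTH)]

if __name__ == "__main__":
    with Pool() as pool:                    # more cores -> near-linear speedup here
        image = pool.map(render_row, range(HEIGHT))
    print("rendered", len(image), "rows")
```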

If this was the motivation, versus, say, a desire to diversify into alternative revenue sources such as the existing GPU marketspace, then it would make sense that their priority is really to make sure the product can service the GPGPU industry; if it can be sold as a discrete GPU, then that is just extra.

Don't get me wrong, I think Intel would love to own the GPU gaming market, and the console space as well with Larrabee, but it appears to be a rather distant secondary concern.

So do you feel that it's sort of a bait and switch?

I think they are using the PCI-E slot as a Trojan horse for their next progression of computing platforms. They get into your system as a graphics card, but look how much faster it is now at all these other tasks.

That Intel may be trying very hard to mislead GPGPU makers into thinking they are going all out to be competitive in the graphics market, when all along, ever since the idea of Larrabee was fostered, the goal was to get a multicore CPU to perform, or outperform, massively parallel tasks such as CUDA or Stream?

I don't think they have fooled AMD or nVidia for an instant, but they are going to try and fool the typical consumer, in a way. Look at the entire design of Larrabee: it is flat-out bad for a GPU. Using pre-P2 cores bolted together with some modifications for running graphics? It can't be taken seriously as an attempt to enter the GPU market. What it does do, however, is put an x86 alternative to CUDA out there before CUDA gains too much traction. I think it could end up a mixed bag there too. Yes, Larrabee is x86-based, which is going to help some, but it is also in-order and relies heavily on vectorization for performance. In other words, almost no code base around is going to run fast on it without a rewrite anyway (anything old enough to be in-order isn't going to be heavily vectorized, and anything new enough to be heavily vectorized is unlikely to be in-order).

This is one of the reasons I see nV as so utterly bullish about going head to head with Intel on this one. They are shouting from the rooftops how badly they are going to whip Intel in the GPU market, because they know they are - and it isn't even going to be close. They don't go nearly as far when discussing ATi parts, as they know ATi is always going to be competitive (even on ATi's 'bad' cycles they are still in the ballpark; Intel won't be). nV is starting very early, and very confidently, to convince users that Intel can't make a GPU worth a damn - even though they know that isn't what Intel is really trying to do (at least in the near future). Because nV sells tens of millions of GPUs to gamers, they can sell parts to the HPC sector for a relative pittance when looking at the R&D to develop them versus the volume those markets push. If Intel honestly wanted to make the fastest GPU on the market, put the same amount of money into it as they did into Larrabee, and made it on the same build process (which ATi/nV can't hope to compete with), they would at the very least be extremely competitive if not dominant; that isn't their goal. They want to use the GPU market to subsidize further development and to work as a potential defensive measure if things were to go horribly wrong for Intel moving forward (say, the DX14 timeframe, when any ARM CPU combined with a GPU can run any normal application far faster than a CPU... maybe).

I think it is safe to say there are likely a lot more people at Intel worried about CUDA/STREAM and Tegra than there are at AMD/nV worried about Larrabee. Maybe Intel will get serious with Larrabee2, or perhaps they will run back to their integrated doghouse with their tail between their legs like the last time they wanted to play in this market. However it plays out, in relative terms I have no problem stating that Larrabee will be much faster in GPGPU measures than in any gaming benchmarks.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: BenSkywalker
On a more pedestrian note: does anyone have an idea at what resolution the demo was run? And assuming, for a moment, that ray tracing were as ubiquitous as rasterization is today, how would AMD/NV's offerings fare at that ray tracing demo?

This is a bit of a loaded question. If nV/AMD ran that exact same demo right now, their boards would suck running it, much as Larrabee does. The big difference is that Larrabee is built with nigh no compromises to run exactly that, while AMD/nV would just happen to be able to run it because of what their hardware is capable of; a huge portion of their die space would simply be idle while doing it. There are many, many reasons why ray tracing isn't a good choice for real-time rendering - nVidia and ATi both know this quite well (dynamic geometry and diffuse being huge stumbling blocks for ray tracing, both key elements of games). Intel made a gamble that rasterizers would stop scaling, at which point ray tracing would start to catch up (you could have a million cores working on a ray traced scene with perfect scaling as long as the support systems were in place). The problem for Intel is that rasterization is scaling as fast as it ever has, so the rift is still growing at a significant pace; it will be a very long time before ray tracing comes close to being even with rasterization for real time (likely more than a decade at least - and that is just talking speed; in terms of feature parity it will take much longer).

If this was the motivation, versus, say, a desire to diversify into alternative revenue sources such as the existing GPU marketspace, then it would make sense that their priority is really to make sure the product can service the GPGPU industry; if it can be sold as a discrete GPU, then that is just extra.

Don't get me wrong, I think Intel would love to own the GPU gaming market, and the console space as well with Larrabee, but it appears to be a rather distant secondary concern.

So do you feel that it's sort of a bait and switch?

I think they are using the PCI-E slot as a Trojan horse for their next progression of computing platforms. They get into your system as a graphics card, but look how much faster it is now at all these other tasks.

That Intel may be trying very hard to mislead GPGPU makers into thinking they are going all out to be competitive in the graphics market, when all along, ever since the idea of Larrabee was fostered, the goal was to get a multicore CPU to perform, or outperform, massively parallel tasks such as CUDA or Stream?

I don't think they have fooled AMD or nVidia for an instant, but they are going to try and fool the typical consumer, in a way. Look at the entire design of Larrabee: it is flat-out bad for a GPU. Using pre-P2 cores bolted together with some modifications for running graphics? It can't be taken seriously as an attempt to enter the GPU market. What it does do, however, is put an x86 alternative to CUDA out there before CUDA gains too much traction. I think it could end up a mixed bag there too. Yes, Larrabee is x86-based, which is going to help some, but it is also in-order and relies heavily on vectorization for performance. In other words, almost no code base around is going to run fast on it without a rewrite anyway (anything old enough to be in-order isn't going to be heavily vectorized, and anything new enough to be heavily vectorized is unlikely to be in-order).

This is one of the reasons I see nV as so utterly bullish about going head to head with Intel on this one. They are shouting from the rooftops how badly they are going to whip Intel in the GPU market, because they know they are - and it isn't even going to be close. They don't go nearly as far when discussing ATi parts, as they know ATi is always going to be competitive (even on ATi's 'bad' cycles they are still in the ballpark; Intel won't be). nV is starting very early, and very confidently, to convince users that Intel can't make a GPU worth a damn - even though they know that isn't what Intel is really trying to do (at least in the near future). Because nV sells tens of millions of GPUs to gamers, they can sell parts to the HPC sector for a relative pittance when looking at the R&D to develop them versus the volume those markets push. If Intel honestly wanted to make the fastest GPU on the market, put the same amount of money into it as they did into Larrabee, and made it on the same build process (which ATi/nV can't hope to compete with), they would at the very least be extremely competitive if not dominant; that isn't their goal. They want to use the GPU market to subsidize further development and to work as a potential defensive measure if things were to go horribly wrong for Intel moving forward (say, the DX14 timeframe, when any ARM CPU combined with a GPU can run any normal application far faster than a CPU... maybe).

I think it is safe to say there are likely a lot more people at Intel worried about CUDA/STREAM and Tegra than there are at AMD/nV worried about Larrabee. Maybe Intel will get serious with Larrabee2, or perhaps they will run back to their integrated doghouse with their tail between their legs like the last time they wanted to play in this market. However it plays out, in relative terms I have no problem stating that Larrabee will be much faster in GPGPU measures than in any gaming benchmarks.

Very well stated, BenSkywalker. My opinion on Larrabee and Intel has slowly evolved over the past 18 months, and I've got to admit that what you state above pretty much reflects my sentiments on it now. (Because you wrote it down, now I can see it and align my thinking to it - very rational, logical, reasonable... thanks!)
 