Intel will launch Larrabee

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
I am glad there is another player in this industry; the more the better. Larrabee is capable of 2 TFLOPS and uses x86 parallel processing. Do you guys think it will beat AMD and NV GPUs? I hope it does.

Intel will release its discrete graphics processor, codenamed "Larrabee". In an industry that's pretty much governed by NVIDIA and AMD, Intel is looking for its slice of the pie. Larrabee derives its parallelism from an array of x86 processing elements. It is not far-fetched to assume that Larrabee will face stiff competition from both NVIDIA and AMD, both of which will have graphics technologies two generations ahead of today's by the time it ships. NVIDIA has already raised questions about Larrabee's credibility in a 2010 setting. On Intel's side, it has to show the industry what its new product is capable of, using technology demos and games that exploit Larrabee; in other words, unleash the potential of the product. One such title is being developed by Offset, a game production house acquired by Intel this past February, giving Intel access to its game developer staff and its own game engine.

The first game based on that engine, titled Project Offset, is in the works. A video leaked on YouTube earlier showed its gameplay and is said to have received praise for its visuals. As far as using this engine in other titles that could serve as "launch vehicles" for Larrabee goes, it's known that the engine has been licensed by Red5 Studios. For now, concept art showing the engine's capabilities has been released.

http://www.techpowerup.com/img/08-09-06/12a.jpg
http://www.techpowerup.com/img/08-09-06/12b.jpg

Gameplay (Project Offset): http://youtube.com/watch?v=rRgKMso1rGA


Source: http://www.techpowerup.com/index.php?70679
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
From the vid on YouTube it looks very 'meh', nothing too exciting. Nice graphics, I guess.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
it looks like it will suck
[my deepest personal feelings vindicated]

http://news.cnet.com/8301-13924_3-10039973-64.html


Analyst: Intel's Larrabee chip needs reality check


... the discrete graphics market ... is dominated by Nvidia and AMD's ATI graphics chip unit. Both companies supply chips that easily rival--or best--any Intel chip in complexity. Nvidia's latest chip, the GTX 280, boasts 1.4 billion transistors and 240 stream processors. In short, it is an extremely complex parallel-computing engine.

"Intel claims Larrabee is a graphics engine intended to outperform Nvidia's GPU offerings. The audacity of this claim is startling," according to a report issued by Ashok Kumar, an analyst at investment bank Collins Stewart. "Nvidia has had over 10 years to optimize the 3D graphics pipeline, the necessary drivers, the platform connections needed to supply the memory bandwidth required, and to work with the software and apps developers," he writes. (Note: Kumar started coverage of Intel at Collins Stewart on September 4 with a "buy" rating.)
. . .

Kumar claims that the task is extremely daunting. "Two of Intel's main challenges at present are the GPU threat by Nvidia and the heavy lifting required for software to make use of multicore processors. With the upcoming Larrabee chip, Intel has chosen to undertake a frontal assault on both of these problems simultaneously," Kumar writes.

Drawing on the Pentium heritage
The report also claims that one of Larrabee's purported strengths--its x86 heritage that taps into a large existing software infrastructure--is a weakness, too, because it is based on the Pentium, a design Intel launched in 1993. "Larrabee proposes to compete by fielding a couple of dozen x86 cores on a single chip. Each core, by dint of its (Pentium) heritage...is carrying all of the baggage of the full x86 instruction set," Kumar writes, adding that the Pentium design is "antiquated."

CNET Blog Network contributor Peter Glaskowsky wrote in August that "the power consumption of a 32-core design with all the extra overhead required by x86 processing would be very high."

Code would not necessarily execute efficiently either, according to Kumar. "The vast majority of x86 code in the world today was not compiled to optimize execution on a Pentium-class core...and will suffer the same old slowdowns," he said.

But Intel's Larry Seiler, principal engineer at Intel's Visual Computing Group, says there's a method behind the madness. At a session in August, he described the original Larrabee experiment. "They replaced the modern out-of-order core with a core derived from the Pentium design. So it's a much smaller core. It's in-order. It doesn't have the advanced capabilities (of more modern processors). But what they found is that they could fit ten cores...in the space of two (modern cores)," he said.

So, Larrabee's Pentium-derived design "has five times as many cores; each core has a vector (processor) unit that is four times as wide. So for throughput computing, potentially, it can run 20 times faster," Seiler said.

This was the key idea that made the original group that came up with the idea of Larrabee realize that it could design an architecture built out of CPU components that achieves the kind of performance and parallelism that previously had been the domain of graphics processors, Seiler said.

Kumar will be surprised if Intel pulls it off. "If Larrabee ends up knocking out Nvidia, it will be a shocking upset, considering how much inertia in the software industry is present, the degree of difficulty many-core chips present, and the high efficiency of Nvidia's existing designs," Kumar writes.

Intel's Gelsinger relishes the challenge. "We've not been bashful about saying we want to win Nehalem," Gelsinger said. Nehalem is Intel's next-generation chip architecture that will roll out over the next 12 months. "(Larrabee) will plug into Nehalem, and into Westmere, and into Sandy Bridge," he said, referring to future Intel chip platforms. "And volume consumer applications as well," he said.

"It's cool to watch somebody step up to the plate and point to the fences like Babe Ruth. Just remember: Babe Ruth also struck out a lot," Kumar concludes in his report.

 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I am very wary of Intel's Larrabee. If they are truly making a high-performing product, then that's good. But after they effectively sabotaged PC gaming by flooding the market with anemic IGP solutions . . . let's just say I won't be dropping my ATI card right away.
 

emilyek

Senior member
Mar 1, 2005
511
0
0
Watch the hi-def Offset videos and tech demos if you can find them; they are impressive -- and they were very, very impressive when they were first shown over three years ago, running on what is now midrange hardware.

Edit:

http://www.projectoffset.com/videos.php

The situation with Intel is interesting: the Offset developers had made an engine that looked fabulous and could do some really neat stuff at very low hardware cost, but now Intel has taken the game and is apparently making them tune it for Larrabee.

Weird, in a way, especially since this has delayed the game significantly; and it moreover makes one wonder if the Offset team's achievement won't be diminished by the Intel association.

Larrabee, Offset and Intel's PC Gaming Alliance will probably make a three-headed media blitz whenever this thing is released.

And it bodes well for PC gaming: superior new hardware + an explicit focus on PC gaming + talented indie game developers + Intel's unlimited resources.

I like the sound of it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: emilyek
Watch the hi-def Offset videos and tech demos if you can find them; they are impressive -- and they were very, very impressive when they were first shown over three years ago, running on what is now midrange hardware.

Edit:

http://www.projectoffset.com/videos.php

The situation with Intel is interesting: the Offset developers had made an engine that looked fabulous and could do some really neat stuff at very low hardware cost, but now Intel has taken the game and is apparently making them tune it for Larrabee.

Weird, in a way, especially since this has delayed the game significantly; and it moreover makes one wonder if the Offset team's achievement won't be diminished by the Intel association.

Larrabee, Offset and Intel's PC Gaming Alliance will probably make a three-headed media blitz whenever this thing is released.

And it bodes well for PC gaming: superior new hardware + an explicit focus on PC gaming + talented indie game developers + Intel's unlimited resources.

I like the sound of it.

i think it is all fake

it is P4's NetBust all over again

Maybe in 5-10 years they will have something competitive with today's graphics - by then Nvidia will be light years ahead of them




intel is "all CPU" . . . that is what their engineers do. They have crap IG and no clue "how" to do graphics.
- the rest is PR and marketing

my opinion
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Don't underestimate Intel; they can invest billions and force the research needed to compete. There were some people who thought Pentium 4 and Pentium D marked the end of Intel - everyone used to buy AMD's FX-60 - but now it's totally the opposite.

The same thing might happen in the graphics industry; personally, I welcome all new players: Intel, S3, IBM... etc.


 

razor2025

Diamond Member
May 24, 2002
3,010
0
71
Larrabee's structure is interesting, but I'm not sure their idea flies. Sure, GPUs are massively parallel architectures and thus useful for throughput computing (as evidenced by the surge of new software that taps their power, e.g. F@H, CUDA, etc.), but ganging together multiple in-order CPU cores might not yield the same results.

Kumar's point about x86 instructions is also a concern. I'm no expert in GPU architecture, but aren't GPUs much slimmer instruction-wise? I hope Intel's slim-down plan for its cores is enough to cut the excess fat out of traditional x86 designs.

Also as pointed out in AT's article:
Article
We've already shown that AMD's architecture requires a lot of help from the compiler to properly schedule and maximize the utilization of its execution resources within one of its 5-wide SPs; with Larrabee the importance of the compiler is tremendous. Luckily for Larrabee, some of the best (if not the best) compilers are made by Intel. If anyone could get away with this sort of an architecture, it's Intel.

Intel's compiler will need to compensate for far more numerous parallel processing units. ATI's design already has issues keeping its 5-wide units fed; Intel needs to write a compiler that does the same (or better) for a 16-wide unit.

On a side note, it's kind of interesting how ATI and Intel are going down a similar design path: take a simple processing unit (SP/core) and group them together, integrate a number of these groups into a single chip, and write a compiler to utilize them as best you can. This could be evidence that process technology isn't advancing as fast as it used to, and that monolithic processor designs are becoming harder to justify. It also allows easy scaling of products by simply reducing the number of processing units, without any real redesign of the GPU. No wonder people are scared for nVidia.
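
To see why the compiler matters so much here, a toy model helps (Python, purely illustrative -- this is not Intel's or ATI's actual scheduler): it packs independent operations into the issue slots of a w-wide unit and reports utilization. A long dependency chain fills 1 of w slots per cycle; fully independent work fills all of them.

Code:
# Toy list scheduler: pack independent ops into a w-wide issue slot.
# Purely illustrative -- not Larrabee's or ATI's real compiler.

def schedule(ops, deps, width):
    """ops: list of op ids; deps: op -> set of ops it depends on."""
    done, cycles = set(), []
    remaining = list(ops)
    while remaining:
        # An op may issue this cycle only if its inputs finished earlier.
        ready = [o for o in remaining if deps.get(o, set()) <= done]
        issue = ready[:width]          # fill at most `width` slots
        cycles.append(issue)
        done.update(issue)
        remaining = [o for o in remaining if o not in done]
    return cycles

ops = list(range(10))
chain = {i: {i - 1} for i in range(1, 10)}  # each op needs the previous one
indep = {}                                  # no dependencies at all

for name, deps in [("dependent chain", chain), ("independent ops", indep)]:
    c = schedule(ops, deps, width=5)
    print(f"{name}: {len(c)} cycles, {len(ops) / (len(c) * 5):.0%} of slots used")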
 

emilyek

Senior member
Mar 1, 2005
511
0
0
Originally posted by: apoppin

Maybe in 5-10 years they will have something competitive with today's graphics - by then Nvidia will be light years ahead of them

my opinion

It's possible. There was a recent article that estimated Larrabee would only be on par with current hardware.

But, I dunno; if Intel is serious and not just screwing around -- and it appears they are serious -- it seems like they would have a pretty good shot at creating a Core 2 sort of dominance of the GPU market.

Competition is good, whatever happens.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: razor2025
Larrabee's structure is interesting, but I'm not sure their idea flies. Sure, GPUs are massively parallel architectures and thus useful for throughput computing (as evidenced by the surge of new software that taps their power, e.g. F@H, CUDA, etc.), but ganging together multiple in-order CPU cores might not yield the same results.

Kumar's point about x86 instructions is also a concern. I'm no expert in GPU architecture, but aren't GPUs much slimmer instruction-wise? I hope Intel's slim-down plan for its cores is enough to cut the excess fat out of traditional x86 designs.

Also as pointed out in AT's article:
Article
We've already shown that AMD's architecture requires a lot of help from the compiler to properly schedule and maximize the utilization of its execution resources within one of its 5-wide SPs; with Larrabee the importance of the compiler is tremendous. Luckily for Larrabee, some of the best (if not the best) compilers are made by Intel. If anyone could get away with this sort of an architecture, it's Intel.

Intel's compiler will need to compensate for far more numerous parallel processing units. ATI's design already has issues keeping its 5-wide units fed; Intel needs to write a compiler that does the same (or better) for a 16-wide unit.

On a side note, it's kind of interesting how ATI and Intel are going down a similar design path: take a simple processing unit (SP/core) and group them together, integrate a number of these groups into a single chip, and write a compiler to utilize them as best you can. This could be evidence that process technology isn't advancing as fast as it used to, and that monolithic processor designs are becoming harder to justify. It also allows easy scaling of products by simply reducing the number of processing units, without any real redesign of the GPU. No wonder people are scared for nVidia.
Sony did it first commercially, stringing cores together with the Cell processor in the PS3. And don't be afraid for Nvidia .. they are taking CUDA in a similar direction. They want to have "Nvidia Inside" *everything* that CAN use a GPU. They want to be indispensable to the pro as well as the gaming market, and they have created many new uses for CUDA in the commercial and home markets also. Nvision08 was an impressive gathering of the many partners that support them in this.

My issue is that x86 is a LOT of "excess baggage" for Larrabeast to handle. And intel has zero clue about graphics, as evidenced by their IG; show me a current intel IG solution that even comes close to AMD's IG
- intel talks the talk, but we saw the P4 walk; intel is capable of massive failure also

It will depend on their compiler .. that is why i say 5-10 years

. . . and it IS good to see intel pushing AMD and Nvidia to excel in graphics. They got Nvidia off their butt and into pushing CUDA.

http://ati.amd.com/technology/...omputing/sdkdwnld.html

^AMD is working on their own SDK^
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
You're a bit pessimistic, apoppin. I'm going to wait and see. It sure is a huge adventure for Intel, but who knows, perhaps they'll pull a rabbit out of their mighty hat. Even if they don't outperform Nvidia, they might offer a very good bang-for-buck midrange videocard. With all these resources being poured into research, their IGPs might get a lot better too. Imagine HD4850 performance for $75? Scaling should be SUPER good, so if they have an SLI/CF equivalent it could become even more interesting.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: MarcVenice
You're a bit pessimistic, apoppin. I'm going to wait and see. It sure is a huge adventure for Intel, but who knows, perhaps they'll pull a rabbit out of their mighty hat. Even if they don't outperform Nvidia, they might offer a very good bang-for-buck midrange videocard. With all these resources being poured into research, their IGPs might get a lot better too. Imagine HD4850 performance for $75? Scaling should be SUPER good, so if they have an SLI/CF equivalent it could become even more interesting.

actually i am optimistic

Intel's impending Larrabeast failure will drive *both* AMD and Nvidia to greater heights


yes, it will be a rabbit .. good improvement in intel's IG
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If I remember correctly, those cores will have half their die space dedicated to supporting the x86 instruction set...

Anyways, 2 TFLOPS is very impressive, considering the 9800GTX is 0.6 TFLOPS, the GTX 280 is 0.9 TFLOPS and the 4870 is 1.2 TFLOPS (a modern quad core? under 40 GFLOPS).
But intel is emulating functions on that general-purpose hardware that AMD and nVidia handle with specialized processors.
Only time will tell how this whole thing will end up. But AMD has been working in the same direction with Fusion.
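
For what it's worth, the 2 TFLOPS figure falls straight out of peak-FLOPS arithmetic. A back-of-the-envelope sketch (Python; the 32-core count, 16-wide FMA vector unit, and ~2 GHz clock are rumored/assumed figures, not confirmed specs):

Code:
# Peak FLOPS = cores * vector width * flops per lane per cycle * clock.
# Larrabee numbers below are assumptions from the rumor mill, not Intel specs.

def peak_tflops(cores, vec_width, flops_per_lane, clock_ghz):
    return cores * vec_width * flops_per_lane * clock_ghz / 1000.0

# Assumed: 32 cores, 16-wide vectors, fused multiply-add (2 flops/lane), 2 GHz.
larrabee = peak_tflops(cores=32, vec_width=16, flops_per_lane=2, clock_ghz=2.0)
print(f"Larrabee (assumed): {larrabee:.1f} TFLOPS")  # ~2.0

# Theoretical peaks quoted above, for comparison:
for name, tf in [("9800GTX", 0.6), ("GTX 280", 0.9), ("HD 4870", 1.2)]:
    print(f"{name}: {tf} TFLOPS -> {larrabee / tf:.1f}x gap on paper")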
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I like the whole concept of Larrabee, and the fact that I wouldn't need to care about DX10.1, OpenGL, or specialized this-and-that, since Larrabee can pick all of that up in a simple driver update; there's no need to have your hardware locked in to supporting a limited instruction set. Additionally, Larrabee has the potential to be huge for Linux, with Intel's good open-source record and the fact that Larrabee can run games through a software pipeline (talking mainly DX); indeed, games written for Intel's Larrabee-specific language should all be compatible and run perfectly regardless of OS. If that is the case, then goodbye Windows for me. In the AT article they also hinted at linear scaling, and I like that: need more performance? It is just a simple matter of adding more cores (which perhaps bodes well for a better multi-GPU solution than we have now). It all sounds too good to be true, but Intel's recent record is giving me some hope.
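
The "do it all in software" idea is easiest to see with a toy example. Here's a minimal sketch of the kind of inner loop a software pipeline runs on general-purpose cores (Python, purely illustrative -- Larrabee's real renderer was described as a binned/tiled design, not this naive bounding-box scan). The point: because even rasterization is just code, a driver update really can add new pipeline features.

Code:
# Toy half-space triangle rasterizer. Purely illustrative of a software
# pipeline; not Larrabee's actual (binned/tiled) renderer.

def edge(ax, ay, bx, by, px, py):
    # Signed area: >= 0 when p lies to the left of edge a->b (CCW winding).
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    (x0, y0), (x1, y1), (x2, y2) = tri
    covered = []
    # Scan the triangle's bounding box, clamped to the screen.
    for y in range(max(0, min(y0, y1, y2)), min(height, max(y0, y1, y2) + 1)):
        for x in range(max(0, min(x0, x1, x2)), min(width, max(x0, x1, x2) + 1)):
            # A pixel is covered if it is inside all three edges.
            if (edge(x0, y0, x1, y1, x, y) >= 0 and
                edge(x1, y1, x2, y2, x, y) >= 0 and
                edge(x2, y2, x0, y0, x, y) >= 0):
                covered.append((x, y))
    return covered

print(len(rasterize([(1, 1), (8, 2), (4, 7)], 16, 16)), "pixels covered")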
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Scoop
After their SSD release, I think Intel can do anything. Sure, their IGP sucks, but so do AMD's and NVidia's.

You've got to be joking. The difference between intel's IGPs and AMD/nV's is quite literally night and day. They provide a lot of features that intel lacks (video features, HW acceleration, acceptable performance), not to mention the shoddy drivers for intel IGPs - even getting to the game menu without crashing is a miracle. Anyway, enough OT -

Larrabee is a 2010 product. 2 TFLOPs is impressive, but that's already been accomplished by the R700, I believe, and it's out already. My predictions stay the same: they are going to lose the first round and bleed money while they're at it. Similar in a way to MS and the Xbox.

I also can't stop imagining just how powerful AMD/nV GPUs (R900/G400 respectively) will be by 2010.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Look, the fastest single GPU is the GTX 280, capable of 0.8 TFLOPs. Double that, and you have intel's Larrabee videocard, supposedly of course. Do we see the GTX 280's performance being doubled any time soon? A single GPU doing 2 TFLOPs? I don't; it'll take at least a year, most likely until the end of 2009. I'm discounting AMD here, because TFLOPs don't mean everything: while capable of more TFLOPs, most of the time the GTX 280's GPU > the HD4870's GPU.

In 2010 intel might be able to move to a new production process, cramming more cores onto the same die. They might add more optimizations, and whatnot. It might be a long time, but it's also time that intel can use to pour hundreds of millions into research.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Any chip company is capable of making an oversized chip with 1.4bn transistors on it that can do 2 TFLOPs. As long as the architecture is bloated, nothing can be done about the performance. Let's see how Larrabee performs...
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Aberforth
Any chip company is capable of making an oversized chip with 1.4bn transistors on it that can do 2 TFLOPs. As long as the architecture is bloated, nothing can be done about the performance. Let's see how Larrabee performs...

Even S3?
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: Cookie Monster
Originally posted by: Aberforth
Any chip company is capable of making an oversized chip with 1.4bn transistors on it that can do 2 TFLOPs. As long as the architecture is bloated, nothing can be done about the performance. Let's see how Larrabee performs...

Even S3?

well, S3 was my favorite back when I used to play Monkey Island. They're low on funds, I guess...
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
Originally posted by: Aberforth

Any chip company is capable of making an oversized chip with 1.4bn transistors on it that can do 2 TFLOPs.
No, they really aren't. Anything that complex will outright fail unless you really know what you're doing. nVidia's chip is vastly more complex than anything Intel has ever made, and they don't even have their own manufacturing facilities.
 