Intel will launch Larrabee


nosfe

Senior member
Aug 8, 2007
424
0
0
well, I didn't read it all, but from what I've seen he wasn't talking about software rendering but about using CUDA/Brook+/OpenCL/etc. instead of DirectX/OpenGL/etc.
 

thilanliyan

Lifer
Jun 21, 2005
11,944
2,175
126
Originally posted by: nosfe
well, I didn't read it all, but from what I've seen he wasn't talking about software rendering but about using CUDA/Brook+/OpenCL/etc. instead of DirectX/OpenGL/etc.

I think he did mention coding games in C++.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Arkaign
Originally posted by: apoppin
- Intel talks the talk, but we saw the P4 walk; Intel is capable of massive failure too

This is a grossly ignorant myth. P4 was not a failure at all, except in the enthusiast community after AMD64 hit the streets.

P4 Willamette was a slow start, and RDRAM didn't gel quite right, everything was too expensive and only the bleeding-edge types bought it, for little to no benefit over something like an Athlon Thunderbird on a decent board (hard to find a decent Socket A board back then though, before the NForce2 days).

P4 Northwood started a period of performance superiority for Intel that lasted a pretty long time. Northwood-A processors were roughly equivalent to their AMD-branded counterparts (e.g., a P4 2.0A was roughly equal to an AXP 2000+), then Northwood-B came out and would slightly edge its counterparts (e.g., a P4 2.4B would be slightly faster than an AXP 2400+), and then the Northwood-C series started running away with the game at the end (e.g., a P4 2.8C would often outperform an AXP 3200+ in encoding/games, and the 3.2C was pretty dominant in everything). Not helping AMD's case was the ridiculous overclockability of the Northwood chips. With the AMD chips, a couple of gems stood out: some assorted mobile AXPs, and the Barton 2500+, which was a pretty reliable 3200+ stand-in. But it was a pretty long run of success for Intel, even from an enthusiast perspective. From the market perspective, Intel just piled profit upon profit.

P4 Prescott was the beginning of the end of the P4 architecture. EVEN SO, there were many benchmarks where the P4 remained competitive, particularly encoding. AMD launched their finest hour around this time with the Athlon 64 stuff, but people forget that there wasn't a huge performance delta at the time. A Prescott 3.4GHz was not noticeably slower than an Athlon 64 3400+, although the 3400+ was hands-down a better product (cooler, great NV chipsets, excellent memory performance, great FPU, etc.). Whatever the case, Intel still sold gobs of chips and continued to enjoy massive profits. The smart buyers and enthusiasts were buying AMD64 boxes, but the unwashed masses continued to buy Intel Inside for the most part.

Pentium D, Presler, etc. were all placeholders that moved performance slightly upward and offered new SKUs until Core 2 was ready to hit.

Overall, P4 was a HUGE hit for Intel. Just look at the profits they made, and the long period of time in which a Northwood box was often the fastest thing you could build.

If you want to look at Intel failures, look at their stumbling around with Itanium, and with server offerings in general. They've always dominated desktops, and it looks likely to continue, but the server/back-office environment is quite a different story. And of course the i740 was no home run either lol.

it was a *total failure* - an EMBARRASSMENT to Intel, who DUMPED it after failing to hit their impossible targets and predictions for it

it ended with the disaster of the PressHot CPU, which was much crappier than the decent Northwood that preceded it
- and I am not talking about sales to their gullible public and their OEMs

Intel proclaimed that they would easily see 10GHz out of their Netbust architecture

they started with a bang and ended with a whimper and a complete shift to their second line - the Pentium M, which became the foundation for C2D

btw, the same Netbust P4 engineers are being reused for Larrabeast, and the very same P4 PR people are praising it
- I rest my case


maybe ten years .. Sweeney is a dreamer .. he is hoping to get out from under MS' thumb .. it ain't so easy to buck an entrenched industry where his OWN engine allows devs to do EASY ports and build entire games very quickly.
- he will have to figure out a way to compete with himself

Intel is Big .. but they have BSed us in the past about crap that never became standard - at times the entire industry aligns against them
--Rambust anyone?
:roll:

Larrabeast has the impossible task of lugging x86 overhead along for apps that must be written specifically for it to work really well
- let them show something more than smoke and mirrors if they are only a year off

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I like how he compares a 2.4GHz P4 to an AXP 2400+... yeah, the XP 2400+ was named that way because AMD was saying it outperformed the 2.4GHz P4... The entire Athlon XP line was named after which Intel CPU it outperformed, despite being half to two-thirds the clock rate and price.
The initial P4s were SLOWER clock-for-clock than the P3...

So enthusiasts got the Athlon XP (or the P3 before it), the power-conscious people bought the Athlon XP, the bang-per-buck crowd bought the Athlon XP... the people who wanted the fastest stock performance bought an Athlon XP.
And the only people buying a P4 were suckers who didn't know any better.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: taltamir
please, do try reading it, and you will see what he is actually talking about.


Read this - the whole overview. I will post one quote.


http://www.projectoffset.com/technology.php After reading this, reread the other article.


For Programmers - Programmers will appreciate the straightforward layout of Offset's cross-platform code base. The C++ code is clearly separated into sections for the core engine, renderer, editor, game, and generic libraries. Additionally, all resource files are specified in easy to read XML format, making it easy to track down bugs caused by bad assets.


 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
ok, I see a sales pitch for a graphics engine; it's not a review, btw, it's about a million bullet points. AND I actually went through them (almost all, actually). What does this have to do with what, exactly?
 

nosfe

Senior member
Aug 8, 2007
424
0
0
I read it all and still don't see where he states that GPUs are history
TS: From my point of view, the ideal software layer is just to have a vectorizing C++ compiler for every architecture - NVIDIA, Intel, AMD, whoever. Let us write code in C++ to run on the GPU

JS: It sounds like, instead of the standard CPU plus GPU configuration, we may just have many-core CPUs... or, sorry, not many-core general-purpose CPUs[...]
TS: No, I see exactly where you're heading.

please point out to me where he says that it's the end of GPUs; besides the article's title, all I'm seeing is the death of DirectX/OpenGL and the current way of programming for GPUs, and a lot of talk about making C++ (CUDA/Brook+/OpenCL are all based on C) run on GPUs. Just because he says "software rendering" doesn't mean that it'll run on the CPU; technically speaking, "software" cannot render anything, it's hardware rendering all the way, it just depends on what hardware will render it. The thing is that CPUs and GPUs are different beasts, suited for different types of workloads; raster rendering is what GPUs were created for, so guess which of the two is better suited for the task.

What he is talking about at the end is Fusion-type processors, but with high-end graphics chips in them, and that's way, way in the future, unless there's some breakthrough in getting rid of the processors' heat. Don't forget that he doesn't know hardware engineering; he doesn't know what's feasible with current and near-future tech, he just writes code. That's why he talks about the coding aspect of graphics cards and daydreams about a beautiful world where all he needs is C++, or some language based on it, to do his work.
Problem is that the real world doesn't quite work that way; just look at web browsers. They still haven't all gotten together to make every browser render the same page the same way; it's still full of "tweaks" needed for different browsers, and there aren't any hardware restrictions there, unlike in the graphics world.
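
To make the "C++ on the GPU" idea concrete, here's a rough sketch of what that already looks like in CUDA today. This is my own toy example, not from the article, and all the names in it are made up:

#include <cuda_runtime.h>

// A plain data-parallel C++ loop body, written as a CUDA kernel.
// This is the kind of code Sweeney wants a vectorizing compiler to
// map onto any vendor's hardware automatically.
__global__ void shade(float* out, const float* in, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one lane per element
    if (i < n)
        out[i] = in[i] * 0.5f + 0.25f;  // stand-in for real shading math
}

// Host side: the same C++ program launches the data-parallel part.
// d_out and d_in are assumed to be device (GPU) pointers.
void runShade(float* d_out, const float* d_in, int n) {
    shade<<<(n + 255) / 256, 256>>>(d_out, d_in, n);
    cudaDeviceSynchronize();  // wait for the GPU to finish
}

The catch is that today you need a vendor-specific toolchain (CUDA for NVIDIA, Brook+ for AMD) to write even that much; what he wants is for plain C++ plus a smart compiler to be enough on every architecture.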
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: taltamir
ok, I see a sales pitch for a graphics engine; it's not a review, btw, it's about a million bullet points. AND I actually went through them (almost all, actually). What does this have to do with what, exactly?

I just found it interesting that it's done in C++. Keep in mind this was the engine before Intel bought the company. Much of the overview really doesn't mean anything for Larrabee, but for ATI DX10.1 it does. Intel's Larrabee software renderer should allow the Project Offset game to increase its capabilities to 2x what we know of the engine and the game. Add in Havok physics and this looks like what the article was talking about. It's exciting stuff. But that ATI cinema stuff is really, really cool. That's the kind of stuff that really gets me excited, to see that realism.

I remember the first time I played the FEAR demo. It actually gave me a rush. Had that game been done with the rendering techs now underway, it would have literally scared the shit out of me. But it was a one-time thrill.

With the kind of graphics we're talking about, the screen should immerse you. That's what I want to see. Remember the first time you saw IMAX? The experience doesn't change; each viewing is the same intense visuals. That's what's cool.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Nemesis 1
Originally posted by: taltamir
ok, I see a sales pitch for a graphics engine; it's not a review, btw, it's about a million bullet points. AND I actually went through them (almost all, actually). What does this have to do with what, exactly?

I just found it interesting that it's done in C++. Keep in mind this was the engine before Intel bought the company. Much of the overview really doesn't mean anything for Larrabee, but for ATI DX10.1 it does. Intel's Larrabee software renderer should allow the Project Offset game to increase its capabilities to 2x what we know of the engine and the game. Add in Havok physics and this looks like what the article was talking about. It's exciting stuff. But that ATI cinema stuff is really, really cool. That's the kind of stuff that really gets me excited, to see that realism.

I remember the first time I played the FEAR demo. It actually gave me a rush. Had that game been done with the rendering techs now underway, it would have literally scared the shit out of me. But it was a one-time thrill.

With the kind of graphics we're talking about, the screen should immerse you. That's what I want to see. Remember the first time you saw IMAX? The experience doesn't change; each viewing is the same intense visuals. That's what's cool.

What is the difference between programming C++ in CUDA and using PhysX?
--it looks like Nvidia is also heading for "fully programmable" GPUs - what you want to do on Larrabeast next year, you can evidently do *now* with Tesla and CUDA, or even with AMD's new SDK.

.. or am I missing some facts?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: nosfe
I read it all and still don't see where he states that GPUs are history
TS: From my point of view, the ideal software layer is just to have a vectorizing C++ compiler for every architecture - NVIDIA, Intel, AMD, whoever. Let us write code in C++ to run on the GPU

JS: It sounds like, instead of the standard CPU plus GPU configuration, we may just have many-core CPUs... or, sorry, not many-core general-purpose CPUs[...]
TS: No, I see exactly where you're heading.

please point out to me where he says that it's the end of GPUs; besides the article's title, all I'm seeing is the death of DirectX/OpenGL and the current way of programming for GPUs, and a lot of talk about making C++ (CUDA/Brook+/OpenCL are all based on C) run on GPUs. Just because he says "software rendering" doesn't mean that it'll run on the CPU; technically speaking, "software" cannot render anything, it's hardware rendering all the way, it just depends on what hardware will render it. The thing is that CPUs and GPUs are different beasts, suited for different types of workloads; raster rendering is what GPUs were created for, so guess which of the two is better suited for the task.

What he is talking about at the end is Fusion-type processors, but with high-end graphics chips in them, and that's way, way in the future, unless there's some breakthrough in getting rid of the processors' heat. Don't forget that he doesn't know hardware engineering; he doesn't know what's feasible with current and near-future tech, he just writes code. That's why he talks about the coding aspect of graphics cards and daydreams about a beautiful world where all he needs is C++, or some language based on it, to do his work.
Problem is that the real world doesn't quite work that way; just look at web browsers. They still haven't all gotten together to make every browser render the same page the same way; it's still full of "tweaks" needed for different browsers, and there aren't any hardware restrictions there, unlike in the graphics world.

well, what he is talking about is a new generation of processor units that are neither GPU nor CPU, but descend from both. They are no longer central, there are many of them, they are not dedicated to graphics, they are general. Instead it is more like an APU, an array of processing units, that can take general code and execute it, rather than performing specialized actions.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: apoppin
Originally posted by: Nemesis 1
Originally posted by: taltamir
ok, I see a sales pitch for a graphics engine; it's not a review, btw, it's about a million bullet points. AND I actually went through them (almost all, actually). What does this have to do with what, exactly?

I just found it interesting that it's done in C++. Keep in mind this was the engine before Intel bought the company. Much of the overview really doesn't mean anything for Larrabee, but for ATI DX10.1 it does. Intel's Larrabee software renderer should allow the Project Offset game to increase its capabilities to 2x what we know of the engine and the game. Add in Havok physics and this looks like what the article was talking about. It's exciting stuff. But that ATI cinema stuff is really, really cool. That's the kind of stuff that really gets me excited, to see that realism.

I remember the first time I played the FEAR demo. It actually gave me a rush. Had that game been done with the rendering techs now underway, it would have literally scared the shit out of me. But it was a one-time thrill.

With the kind of graphics we're talking about, the screen should immerse you. That's what I want to see. Remember the first time you saw IMAX? The experience doesn't change; each viewing is the same intense visuals. That's what's cool.

What is the difference between programming C++ in CUDA and using PhysX?
--it looks like Nvidia is also heading for "fully programmable" GPUs - what you want to do on Larrabeast next year, you can evidently do *now* with Tesla and CUDA, or even with AMD's new SDK.

.. or am I missing some facts?

@ Nemesis.... oooh, now I get it. I didn't realize that Intel was behind that engine... This is basically a Larrabee engine trying to compete with DirectX/OGL. Very interesting.

@ Apoppin: PhysX is a special, constrained physics engine: you tell it what you have (wood, water, etc.), and it performs physics calculations on it the way it wants and spits out results. CUDA and x86 processing mean taking C++ code and running it. It could be ANYTHING: custom physics code, shader code, regular rendering, 2D rendering, scientific calculations, video decoding/encoding, etc... everything is done in generic C++ code that runs on the new APU (array of processing units, you heard it here first), which is a hybrid descendant of the CPU and GPU. A rough sketch of the difference is below.
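
With PhysX you hand a fixed engine a description of your objects and it decides how to simulate them; with generic CUDA you write whatever per-element code you want. Here's a toy custom physics step in CUDA - my own made-up example, nothing official:

#include <cuda_runtime.h>

// Your own particle format - not dictated by any physics engine.
struct Particle { float x, y, z, vx, vy, vz; };

// A hand-rolled Euler integration step: every rule here (gravity,
// drag, whatever) is code YOU wrote, not a canned engine behavior.
__global__ void step(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per particle
    if (i >= n) return;
    p[i].vy -= 9.8f * dt;     // gravity, or any rule you invent
    p[i].x  += p[i].vx * dt;  // integrate position from velocity
    p[i].y  += p[i].vy * dt;
    p[i].z  += p[i].vz * dt;
}

And that same generality is what Larrabee is promising, just with x86 instead of CUDA as the programming target.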
 

her34

Senior member
Dec 4, 2004
581
1
81
Even if Larrabee performs worse than Nvidia at games, Intel will win by default.

Intel will eventually add a few Larrabee cores to their CPUs; thus they will win market share over CUDA, thus they will win developer support for non-game applications, thus they will win the mainstream for add-in cards. Nvidia will be relegated to the high end, which will slowly fade away.

That's the long-term way. The short-term way is to outperform Nvidia at games, which may be possible given Intel's fabrication advantage.

The only way Nvidia can continue is to drastically outperform Intel, as happened with GeForce vs. GMA. GeForce vs. Larrabee will not be so disparate.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
well.... indications are that Larrabee will actually be fighting Nvidia's next gen, not the G200... and that this next gen will be even more generalized computing than CUDA.
 