Intel will launch Larrabee


Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: taltamir
Originally posted by: Aberforth
Originally posted by: taltamir
Originally posted by: MarcVenice
Look, the fastest single GPU is a GTX 280, capable of 0.8 TFLOPS. Double that, and you have Intel's Larrabee video card, supposedly of course. Do we see the GTX 280's performance being doubled any time soon? A single GPU doing 2 TFLOPS? I don't; it'll take at least a year, most likely the end of 2009. I'm discounting AMD here, because TFLOPS don't mean everything. While capable of more TFLOPS, most of the time the GTX 280's GPU > the HD 4870's GPU.

In 2010 Intel might be able to move to a new production process, cramming more cores onto the same die. They might add more optimizations, and whatnot. It might be a long time, but it's also time that Intel can use to pour hundreds of millions into research.

Well, Larrabee is 10 SEPARATE CPU cores... it's not dual-GPU, it's deca-CPU emulating a GPU.

Right... NV has 256 (or whatever the number is these days) shader processors; as far as I know they are quite separate from each other.

Does the NV 256 have 10 separate x86 decoders? And is each shader group a full x86-compatible core?

NV created them because MS came up with a geometry shader pipe in DX10, and because of that the architecture needed a redesign. But these processors do play a role in raw processing; since they don't have any proprietary instructions, it's not worth talking about anyway. Forced engineering is what they do, like creating massive power-sucking GPUs.

BTW, x86 and x86-64 instructions can easily be licensed from Intel.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Easily? Intel will NEVER license those out, or will require ridiculous sums of money to do so.

Anyway... my whole point wasn't that 10 CPU cores is awesome or anything, but that not counting the 4870X2 because it is multi-GPU and saying that Larrabee is the highest-FLOPS single GPU is inaccurate, because it is not a single GPU; it is 10 CPUs emulating a GPU.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: taltamir
Easily? Intel will NEVER license those out, or will require ridiculous sums of money to do so.

Anyway... my whole point wasn't that 10 CPU cores is awesome or anything, but that not counting the 4870X2 because it is multi-GPU and saying that Larrabee is the highest-FLOPS single GPU is inaccurate, because it is not a single GPU; it is 10 CPUs emulating a GPU.

I said the shader processors do play a role in raw processing - that doesn't count, eh? x86 is a successful architecture - it's tried and tested, also used in NASA satellites, supercomputers, etc. - if it's going to be implemented to process graphics, then what's the big deal? Intel has been doing machine code for many years.

If you think of it as 10 CPUs, then NV definitely has 256 cores without a definite standard. Yes, they came up with CUDA, which was never a standard. In the graphics industry there is always a constant flux - you have to keep redesigning whenever there is a change on the software side; nothing is proprietary or standard. Just a bunch of companies making money.

And Intel does license instructions; I think that's how AMD is making CPUs. Nobody is stopping NV from implementing x86.
 

nosfe

Senior member
Aug 8, 2007
424
0
0
What I don't get at all is why multiple graphics cards scale so poorly. I mean, they're inherently very parallel, being made of hundreds of small cores/shaders/whatever, so how come it all goes downhill when you add another GPU into the equation? It completely baffles me that it took so long for someone to come up with a system like Lucid's Hydra (assuming it actually works).
 

bamacre

Lifer
Jul 1, 2004
21,029
2
61
Originally posted by: Bateluer
I am very wary of Intel's Larrabee. If they are truly making a high-performing product, then that's good. But after them effectively sabotaging PC gaming by flooding the market with anemic IGP solutions . . . let's just say I won't be dropping my ATI card right away.

WTF? IGP solutions are meant to deliver inexpensive video options to consumers who don't need anything more powerful.

Put the blame where it belongs: uninformed customers who are too lazy to research before they buy.

Sticking a GTX 260 in every computer made isn't a very bright idea.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Aberforth
Originally posted by: taltamir
Easily? Intel will NEVER license those out, or will require ridiculous sums of money to do so.

Anyway... my whole point wasn't that 10 CPU cores is awesome or anything, but that not counting the 4870X2 because it is multi-GPU and saying that Larrabee is the highest-FLOPS single GPU is inaccurate, because it is not a single GPU; it is 10 CPUs emulating a GPU.

I said the shader processors do play a role in raw processing - that doesn't count, eh? x86 is a successful architecture - it's tried and tested, also used in NASA satellites, supercomputers, etc. - if it's going to be implemented to process graphics, then what's the big deal? Intel has been doing machine code for many years.

If you think of it as 10 CPUs, then NV definitely has 256 cores without a definite standard. Yes, they came up with CUDA, which was never a standard. In the graphics industry there is always a constant flux - you have to keep redesigning whenever there is a change on the software side; nothing is proprietary or standard. Just a bunch of companies making money.

And Intel does license instructions; I think that's how AMD is making CPUs. Nobody is stopping NV from implementing x86.

Dead wrong!

AMD and Intel have ancient cross-licensing agreements.

Intel won't license to Nvidia [period!!!]

Nvidia has to buy VIA to get an x86 license or work around it.

Larrabeast will be Intel's biggest FLOP ever
- my prediction

Intel's CPU engineers are the best in the world, but they are clueless about graphics... it takes a lot of time to buy graphics engineers and do the research.
The way Intel is approaching it is half-assed, IMO, which leads me to believe it is more PR than real.

Eventually Intel will probably get decent IG out of it
- not wasted research, IMO





 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AMD only has a license because they won it in court (due to Intel trying to comply with IBM, and later changing their minds).
http://en.wikipedia.org/wiki/A...d_the_x86_architecture
In February 1982, AMD signed a contract with Intel, becoming a licensed second-source manufacturer of 8086 and 8088 processors. IBM wanted to use the Intel 8088 in its IBM PC, but IBM's policy at the time was to require at least two sources for its chips. AMD later produced the Am286 under the same arrangement, but Intel canceled the agreement in 1986 and refused to convey technical details of the i386 part. AMD challenged Intel's decision to cancel the agreement and won in arbitration, but Intel disputed this decision. A long legal dispute followed, ending in 1994 when the Supreme Court of California sided with AMD. Subsequent legal disputes centered on whether AMD had legal rights to use derivatives of Intel's microcode. In the face of uncertainty, AMD was forced to develop "clean room" versions of Intel code.
In 1991, AMD released the Am386, its clone of the Intel 386 processor. It took less than a year for the company to sell a million units. Later, the Am486 was used by a number of large original equipment manufacturers, including Compaq, and proved popular. Another Am486-based product, the Am5x86, continued AMD's success as a low-price alternative. However, as product cycles shortened in the PC industry, the process of reverse engineering Intel's products became an ever less viable strategy for AMD.

The difference between the hundreds of shader cores and a CPU/GPU is in the technical implementation: does it present itself as an individual GPU/CPU by having its own set of unique controller hardware? Indeed, shaders are hundreds of cores capable of doing calculations, but they are presented as one GPU per, well, GPU. There is redundant hardware in multi-processor situations. It is actually a WASTE to have more GPU/CPU cores compared to having one with more calculation-performing cores, because of that replication of central hardware. You keep arguing as if I am saying that 10 CPUs is a GOOD thing. It is not; it is a weakness compared to 1 CPU with the performance of 10. You have wasted controllers and other silicon on it that has been replicated and gives no actual performance benefit - in other words, it's OVERHEAD. But it is needed for various practical reasons. The fact that nvidia and AMD can make a single GPU with multiple calculation-performing shaders is an advantage to them. And even they are limited in how many they can make.

The GTX 280 tries to cram twice the everything of the 9800GTX into one GPU. The 9800GX2 separates it into two separate cores, with redundant overhead, and then uses software to split the data between them: the same number of stream processors and various other internal components. But the "single GPU" implementation eliminates overhead and increases performance.
9800GX2 vs GTX280
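A rough back-of-the-envelope sketch of the overhead argument above; the transistor figures are made-up round numbers chosen purely for illustration, not real die budgets:

```cpp
#include <cstdio>

int main() {
    // Hypothetical, made-up figures purely to illustrate the argument above:
    // every die needs a fixed block of "front-end" hardware (memory
    // controllers, command processor, display logic) that does no shading
    // work, plus shader cores that do the actual computation.
    const double frontEnd  = 200e6; // per-die overhead transistors (assumed)
    const double perShader = 4e6;   // transistors per shader core (assumed)
    const double budget    = 1.4e9; // total transistor budget for the card

    // Single GPU: one copy of the front end, the rest of the budget is shaders.
    double singleShaders = (budget - frontEnd) / perShader;

    // Dual GPU: same total budget split across two dies, so the front end
    // is paid for twice and contributes nothing extra to throughput.
    double dualShaders = 2.0 * ((budget / 2.0 - frontEnd) / perShader);

    std::printf("single GPU: %.0f shader cores\n", singleShaders);
    std::printf("dual GPU  : %.0f shader cores (same budget, duplicated overhead)\n",
                dualShaders);
    return 0;
}
```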
 

kobymu

Senior member
Mar 21, 2005
576
0
0
Originally posted by: apoppin
Larrabeast will be Intel's biggest FLOP ever
- my prediction

Intel's CPU engineers are the best in the world, but they are clueless about graphics... it takes a lot of time to buy graphics engineers and do the research.
The way Intel is approaching it is half-assed, IMO, which leads me to believe it is more PR than real.

You just don't get it, do you?

Larrabee IS A CPU! IT IS NOT A GPU!

There is a very good chance that Larrabee will lose to a 'regular' GPU in ActiveX. I don't think Intel is even going to try too hard to win there, try to keep up, maybe.

However, when - and it is 'when', not 'if' - the PC/console gaming industry moves (back) to SOFTWARE rendering engines ...

SOFTWARE rendering engines <---> CPU

CPU <---> SOFTWARE rendering engines

... if you still can't see the connection, let me put it this way...

Intel doesn't need "graphics engineers"; the graphics part would be the responsibility of the game/engine/middleware developers (and by developers I mean programmers). The only thing that Intel will 'need' to do (and they are already doing it in their 'normal' CPU line of products) is find ways to make the next iteration of the CPU(!) that consumes the SAME software run that software faster.
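To make the "software rendering engine on a CPU" idea concrete, here is a minimal, hypothetical sketch (the names and the toy shading function are invented for illustration, not anything from Intel or any real engine): the per-pixel work that a GPU's shader units would do becomes an ordinary C++ loop, split across however many x86 cores the chip offers.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

// Toy "pixel shader": just an ordinary C++ function. A real engine would run
// its own lighting/texturing code here instead of producing a gradient.
static std::uint32_t shadePixel(int x, int y, int w, int h) {
    auto r = static_cast<std::uint32_t>(255 * x / w);
    auto g = static_cast<std::uint32_t>(255 * y / h);
    return (r << 16) | (g << 8) | 0x40u;
}

int main() {
    const int width = 640, height = 480;
    std::vector<std::uint32_t> framebuffer(width * height);

    // Split the scanlines across all available hardware threads; the same
    // loop scales from a quad core to a many-core chip.
    const int threads =
        static_cast<int>(std::max(1u, std::thread::hardware_concurrency()));
    std::vector<std::thread> workers;
    for (int t = 0; t < threads; ++t) {
        workers.emplace_back([=, &framebuffer] {
            for (int y = t; y < height; y += threads)
                for (int x = 0; x < width; ++x)
                    framebuffer[y * width + x] = shadePixel(x, y, width, height);
        });
    }
    for (auto& w : workers) w.join();

    std::printf("rendered a %dx%d frame in software on %d threads\n",
                width, height, threads);
    return 0;
}
```

The point of the post above is that, in this model, the "graphics pipeline" is simply the engine developer's own code, so speeding it up becomes the same problem Intel already works on for ordinary x86 software.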
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: kobymu
Originally posted by: apoppin
Larrabeast will be Intel's biggest FLOP ever
- my prediction

Intel's CPU engineers are the best in the world, but they are clueless about graphics... it takes a lot of time to buy graphics engineers and do the research.
The way Intel is approaching it is half-assed, IMO, which leads me to believe it is more PR than real.

You just don't get it, do you?

Larrabee IS A CPU! IT IS NOT A GPU!

There is a very good chance that Larrabee will lose to a 'regular' GPU in ActiveX. I don't think Intel is even going to try too hard to win there, try to keep up, maybe.

However, when - and it is 'when', not 'if' - the PC/console gaming industry moves (back) to SOFTWARE rendering engines ...

SOFTWARE rendering engines <---> CPU

CPU <---> SOFTWARE rendering engines

... if you still can't see the connection, let me put it this way...

Intel doesn't need "graphics engineers"; the graphics part would be the responsibility of the game/engine/middleware developers (and by developers I mean programmers). The only thing that Intel will 'need' to do (and they are already doing it in their 'normal' CPU line of products) is find ways to make the next iteration of the CPU(!) that consumes the SAME software run that software faster.

Actually, I do get it.

But there is no point in arguing with you at all.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: apoppin
Originally posted by: Aberforth
Originally posted by: taltamir
Easily? Intel will NEVER license those out, or will require ridiculous sums of money to do so.

Anyway... my whole point wasn't that 10 CPU cores is awesome or anything, but that not counting the 4870X2 because it is multi-GPU and saying that Larrabee is the highest-FLOPS single GPU is inaccurate, because it is not a single GPU; it is 10 CPUs emulating a GPU.

I said the shader processors do play a role in raw processing - that doesn't count, eh? x86 is a successful architecture - it's tried and tested, also used in NASA satellites, supercomputers, etc. - if it's going to be implemented to process graphics, then what's the big deal? Intel has been doing machine code for many years.

If you think of it as 10 CPUs, then NV definitely has 256 cores without a definite standard. Yes, they came up with CUDA, which was never a standard. In the graphics industry there is always a constant flux - you have to keep redesigning whenever there is a change on the software side; nothing is proprietary or standard. Just a bunch of companies making money.


Larrabeast will be Intel's biggest FLOP ever
- my prediction

If you are wrong, will you please make a topic admitting that you were totally wrong and that you are the dumbest being in existence when it comes to making technology predictions. Thnx

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
Originally posted by: kobymu

Is the "transistor count" the new GHz? Or the new IPC for that matter?
You tell me. On what basis was the original claim "...any chip company is capable of making an oversized chip with 1.4bn transistors on it that can do 2 TFLOPS. As long as the architecture is bloated, nothing can be done about the performance. Let's see how Larrabee performs..." made?

nVidia has a 1.4 billion transistor 65 nm GPU that has the single card performance crown.

What does Intel have? Some crappy ray-tracing demos running at VGA resolutions at slideshow frame rates, and a failed Itanium processor, most of whose transistor budget is devoted to the L3 cache.

Again, if all transistors are "equal" as was claimed, then by that metric, if I ship a core that has nothing but caches, can I claim this core is equivalent to any CPU or GPU?

I think not.

Or maybe they can't compete with nVidia and ATi directly with rasterization so they're trying to use marketing ploys and market share muscle to force developers into their back yard?

A manufacturing accomplishment, maybe, but not necessarily a design accomplishment. And if that is the case, some of that "genius" belongs to TSMC.
Sure, TSMC played a part, but I think you're vastly downplaying nVidia's achievement here. The fact is Intel can't even do what they have pulled off with the GT200 (again, cache transistors don't count), and nVidia doesn't even have their own fab like Intel does.

That argument is just utter nonsense!
How is it nonsense? Is designing a cache transistor the same as designing a computation one? Of course not. The cache is just dumb storage; it doesn't do any work.

If said CPU can get the job done faster by dedicating some of its transistor count to do some "dumb" work (cache), then not doing so is plainly the wrong design decision. If you took a modern CPU, removed its cache completely and replaced it with "smart" transistors (additional ALUs or what have you), you would probably see your CPU performance go DOWN!
Right, and? Again I'll ask: if I shipped a 1.4 billion transistor die with nothing but cache (i.e. it is incapable of doing any kind of computation), is that the same as a CPU or GPU that does actual work and also has 1.4 billion transistors?

I mean, let's take it a step further. Can I count the VRAM transistors used in the 1 GB of RAM that is included with the GTX 280? Using your argument I can, because if I removed the VRAM, performance would go down. Right?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
Originally posted by: kobymu

Larrabee IS A CPU! IT IS NOT A GPU!

There is a very good chance that Larrabee will lose to a 'regular' GPU in ActiveX. I don't think Intel is even going to try too hard to win there, try to keep up, maybe.

However, when - and it is 'when', not 'if' - the PC/console gaming industry moves (back) to SOFTWARE rendering engines ...

SOFTWARE rendering engines <---> CPU

CPU <---> SOFTWARE rendering engines
ActiveX? I think you mean DirectX.

And what you're saying will never happen. Current APIs would need to be dropped and the rendering style would need to be changed from rasterization to something else, probably ray tracing. That isn't going to happen, not as long as Microsoft is shipping DirectX with Windows and OpenGL is the cross-platform standard, and both APIs are raster-based.

People have been claiming CPUs will replace GPUs for years, but this is lunacy. Try running any ancient ten-year-old game like Quake, Quake 2 or Unreal that shipped with a software renderer on any modern quad-core CPU and see how piss-poor it runs compared to even budget GPUs. If anything, the rift has gotten wider, not narrower.

I agree with Apoppin that Larrabee will flop, but then I expect Intel will scale it down and include it onboard with their chipsets. It'll still be a failure from a performance and compatibility standpoint, but as far as market penetration goes, it'll be a "success" for Intel.

This is exactly where we are today with Intel's GMA, and Larrabee will be the next GMA, I predict.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,996
126
Originally posted by: Aberforth

If you are wrong, will you please make a topic admitting that you were totally wrong and that you are the dumbest being in existence when it comes to making technology predictions. Thnx
Do you want Virge to lock your thread? If not I suggest you lay off the personal attacks.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: BFG10K
Originally posted by: Aberforth

If you are wrong, will you please make a topic admitting that you were totally wrong and that you are the dumbest being in existence when it comes to making technology predictions. Thnx
Do you want Virge to lock your thread? If not I suggest you lay off the personal attacks.

I wasn't attacking, I was requesting.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: BFG10K
Originally posted by: Aberforth

If you are wrong, will you please make a topic admitting that you were totally wrong and that you are the dumbest being in existence when it comes to making technology predictions. Thnx
Do you want Virge to lock your thread? If not I suggest you lay off the personal attacks.
Funny enough I was just catching up on this thread...

I'm vetoing the "post a worthless thread" idea right now, and I'm not going to put up with any more dumb suggestions or attacks. Play nice, kids.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Take it easy, Aberforth. Don't get the thread closed; leave it. Bookmark it. Come back to the thread when it counts, when we have real-world numbers. Then lay it at his feet.

But software rendering is the future, clear as day!
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Nemesis 1
Take it easy, Aberforth. Don't get the thread closed; leave it. Bookmark it. Come back to the thread when it counts, when we have real-world numbers. Then lay it at his feet.

But software rendering is the future, clear as day!

The future is never anything but cloudy, Nemesis. Unless you're talking death or taxes.

 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
So very true, Keys. But! One has to take a few chances in life in order to stay ahead. Much like my recent trip to Mexico for treatments I can't get in the US. I took a chance on some facts. Because of that I feel way better today. Thank GOD.

The point is, Keys, Larrabee isn't the future. That's not at all what I am saying. What Larrabee represents is the future. Who knows what NV or ATI have planned for late '09 or early '10. I have said this before. The facts or not, us guys in these forums have an advantage: we can see what's coming before it arrives. That = $$$$$$. I saw the NV thing coming over a year ago, but I waited until I was sure. Bingo. I got lucky on timing, and the notebook news hit at the same time I shorted the stock.

So having a view of what the future holds is in fact not foolproof. But it doesn't hurt, Keys, to take a glass-half-full point of view. It could lead to you making a decision that changes your life one way or the other. I don't care who does it first; I just want it here now. No more stupid upgrades every 6 months. Pure rubbish. The way the GPU companies have been diming us to death. With Hydra and software rendering, that's all about to change - FOR the better.

Just for those who might want a view on how I would play my money:

First I look at NV. They have a small quality-control problem they resolved. But it hurts now. It doesn't hurt their future. What does hurt is their arch. This could be their undoing.

Intel's Larrabee is a complete unknown that promises much. Can it deliver?

ATI is sitting quiet, not talking, looking to go multicore. Their arch is superior in every way. I doubt Intel can match their graphics performance in any area, including RT.


The smart money would go on Intel. But I am betting on ATI in graphics and Intel Larrabee in supercomputing. So my money is on AMD/ATI.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Aberforth
Originally posted by: BFG10K
Originally posted by: Aberforth

If you are wrong, will you please make a topic admitting that you were totally wrong and that you are the dumbest being in existence when it comes to making technology predictions. Thnx
Do you want Virge to lock your thread? If not I suggest you lay off the personal attacks.

I wasn't attacking, I was requesting.

It was a request that called me stupid for stating a personal opinion and a prediction labeled as such.

And when I am proved right, what will that make me when it comes to technological predictions?


i AM saying Larrabeast will not meet the rather extraordinary *predictions* Intel is making for it; that they are hyping to heck a future marchitecture - something that is not "real" yet.

i AM also saying that Intel's research WILL pay off - but only in their IG, and maybe in the future - in 5-10 years - what they envision may come to pass; but in the meantime AMD and Nvidia will also have moved beyond them.

Much like my recent trip to Mexico for treatments I can't get in the US. I took a chance on some facts. Because of that I feel way better today. Thank GOD.
My mother's friend is also heading to Mexicali for alternative treatment, September 5-25
- i wish you well


The smart money would go on Intel. But I am betting on ATI in graphics and Intel Larrabee in supercomputing. So my money is on AMD/ATI.
Mine is on Nvidia/AMD .. i simply do not believe Intel's PR
- - - the P4's 10GHz Netbust *predictions* come to mind



 

kobymu

Senior member
Mar 21, 2005
576
0
0
Originally posted by: apoppin
but there is no point in arguing with you at all
:brokenheart: =*(


Originally posted by: BFG10K
You tell me. On what basis was the original claim "...any chip company is capable of making an oversized chip with 1.4bn transistors on it that can do 2 TFLOPS. As long as the architecture is bloated, nothing can be done about the performance. Let's see how Larrabee performs..." made?
If you're asking me, then the answer is "it's not about transistor count but what you do with it".

nVidia has a 1.4 billion transistor 65 nm GPU that has the single card performance crown.

nVidia has a product that has an economic justification! The fact that it can sell a chip of that size is the important metric here. The fact that that product has the single-card performance crown can be argued (not enough competition), but I'm not interested in playing the devil's advocate.

Right now, due to the nature of the task at hand (rendering) and its characteristics (extremely easy to do in parallel), there is a justification for a single massive piece of silicon. If that massive piece of silicon were one monolithic, extraordinarily complex chip, I'm not sure nVidia would have been able to pull it off. The fact that that chip is made up of many identical units (SPs) IS one of the more important metrics here; it is one of the reasons that fabricating a 1.4 billion transistor chip is economically justified. Designing 1.4 billion transistors' worth of logic in a single (or only a few) big, complex processing unit(s) would have been an order of magnitude (if not a few) harder.

If Intel had the economic justification to design and build such products, I personally would be hard-pressed to believe that they would not have done so, with their know-how of chip fabrication.

The fact that nVidia did it first (making a >1 billion transistor chip), or at all for that matter, is because nVidia had better reason to do it. It really is that simple.

Classical CPU manufacturers and designers just don't have that much to gain from cramming a few dozen 'cores' into a single piece of silicon; most of the code they execute is mostly sequential. And this is exactly where Larrabee is different from classical CPUs: Larrabee is a CPU that specializes in executing parallel code.

What does Intel have?
A vision.

Again, if all transistors are "equal" as was claimed, then by that metric, if I ship a core that has nothing but caches, can I claim this core is equivalent to any CPU or GPU? I think not.
What?!
IF TRANSISTOR COUNT IS A METRIC, THEN ALL TRANSISTORS ARE EQUAL!
Again, if transistor count is a metric, then stating that the job/function of a transistor influences that count is exactly the point at which transistor count as a metric is no longer relevant; the metric becomes "the count of the transistors that do X".

And if you want to go that route... From a design perspective, if you are comparing 'the count of transistors that do X', you should do it properly, apples to apples, and that would be processing unit to processing unit (an SM to a Core 2 Duo core).

Or maybe they can't compete with nVidia and ATi directly with rasterization so they're trying to use marketing ploys and market share muscle to force developers into their back yard?
Only one problem with that: at the end of the day these developers (or, more precisely, those who pay their salary) also need an economic justification to take that path, because that path will cost, a lot, at least at first. If it (moving to software rendering) is not economically justified, nobody will take that route. Intel doesn't have enough "marketing ploys and market share muscle" to force others to take a path that will make them lose money.

Sure, TSMC played a part, but I think you're vastly downplaying nVidia's achievement here. The fact is Intel can't even do what they have pulled off with the GT200 (again, cache transistors don't count), and nVidia doesn't even have their own fab like Intel does.
...
How is it nonsense? Is designing a cache transistor the same as designing a computation one? Of course not. The cache is just dumb storage; it doesn't do any work.
...
Right, and? Again I'll ask: if I shipped a 1.4 billion transistor die with nothing but cache (i.e. it is incapable of doing any kind of computation), is that the same as a CPU or GPU that does actual work and also has 1.4 billion transistors?

I mean, let's take it a step further. Can I count the VRAM transistors used in the 1 GB of RAM that is included with the GTX 280? Using your argument I can, because if I removed the VRAM, performance would go down. Right?
See above.

ActiveX? I think you mean DirectX.
*scratches head* Arr.. yeah.

/many edits for many typos and bad grammar
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: apoppin
Originally posted by: Aberforth
Originally posted by: BFG10K
Originally posted by: Aberforth

If you are wrong, will you please make a topic admitting that you were totally wrong and that you are the dumbest being in existence when it comes to making technology predictions. Thnx
Do you want Virge to lock your thread? If not I suggest you lay off the personal attacks.

I wasn't attacking, I was requesting.

It was a request that called me stupid for stating a personal opinion and a prediction labeled as such.

And when I am proved right, what will that make me when it comes to technological predictions?


i AM saying Larrabeast will not meet the rather extraordinary *predictions* Intel is making for it; that they are hyping to heck a future marchitecture - something that is not "real" yet.

i AM also saying that Intel's research WILL pay off - but only in their IG, and maybe in the future - in 5-10 years - what they envision may come to pass; but in the meantime AMD and Nvidia will also have moved beyond them.

Much like my recent trip to Mexico for treatments I can't get in the US. I took a chance on some facts. Because of that I feel way better today. Thank GOD.
My mother's friend is also heading to Mexicali for alternative treatment, September 5-25
- i wish you well


The smart money would go on Intel. But I am betting on ATI in graphics and Intel Larrabee in supercomputing. So my money is on AMD/ATI.
Mine is on Nvidia/AMD .. i simply do not believe Intel's PR
- - - the P4's 10GHz Netbust *predictions* come to mind

I'm just replying to your last statement. Pre-Merom Intel, I might agree to a point. But since the Merom timeframe, I trust what Intel says. They have been very good - better than all others combined, I would say. So if you go by recent history, I would go against what you're saying. Until Intel fails to deliver.

 

nosfe

Senior member
Aug 8, 2007
424
0
0
Yes, but what will they deliver exactly? Creative sound cards would be the best-case scenario for Larrabee: great hardware, crappy drivers. I'm not saying this because of their IGP drivers, but because they have a ton of games to cover, and also because they can't start working on the drivers properly until later, as hardware specs are still not 100% set in stone.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Originally posted by: apoppin
- Intel talks the talk, but we saw the P4 walk; Intel is capable of massive failure also

This is a grossly ignorant myth. P4 was not a failure at all, except in the enthusiast community after AMD64 hit the streets.

P4 Willamette was a slow start, and RDRAM didn't gel quite right; everything was too expensive and only the bleeding-edge types bought it, for little to no benefit over something like an Athlon Thunderbird on a decent board (hard to find a decent Socket A board back then though, before the NForce2 days).

P4 Northwood started a period of performance superiority for Intel that lasted a pretty long time. Northwood-A processors were roughly equivalent to their AMD branded counterparts (eg; a P4 2.0A was ~= to AXP2000+), then Northwood-B came out, and would slightly edge their counterparts (eg; P4 2.4B would be slightly faster than AXP2400+), and then the Northwood-C series started running away with the game at the end (eg; P4 2.8C would often outperform AXP3200+ in encoding/games, and the 3.2C was pretty dominant in everything). Not helping AMD's case was the ridiculous overclockability of the Northwood chips. With the AMD chips, a couple of gems stood out, some assorted mobile AXPs, and the Barton 2500+, which was a pretty reliable 3200+ stand-in. But it was a pretty long run of success here for Intel even from an enthusiast perspective. From the market perspective, Intel just piled profit upon profit.

P4 Prescott was the beginning of the end of the P4 architecture. EVEN SO, there were many benchmarks where the P4 remained competitive, particularly encoding. AMD launched their finest hour around this time with the Athlon64 stuff, but people forget that there wasn't a huge performance delta at this time. A Prescott 3.4GHz was not noticeably slower than an Athlon64 3400+, although the 3400+ was hands-down a better product (cooler, great NV chipsets, excellent memory performance, great FPU, etc). Whatever the case, Intel still sold gobs of chips, and continued to enjoy massive profits. The smart buyers and enthusiasts were buying AMD64 boxes, but the unwashed masses continued to buy Intel Inside for the most part.

Pentium D, Presler, etc, all were placeholders that moved performance slightly upward and offered new SKUs until Core2 was ready to hit.

Overall, P4 was a HUGE hit for Intel. Just look at the profits they made, and the long period of time in which a Northwood box was often the fastest thing you could build.

If you want to look at Intel failures, look at their stumbling around with Itanium, and with server offerings in general. They've always dominated desktops, and it looks likely to continue, but the server/backoffice environment is quite a different story. And of course i740 was no home run either lol.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
What do you think about this interview with the lead UE designer?
http://arstechnica.com/article...-sweeney-interview.ars

10 years ago he estimated that rendering would be back on the CPU circa 2006-2007. He was a bit off, but that is exactly what is happening with Larrabee and nvidia's next (after G200) GPU.
He says UE4 will be coded in C++.

And he has ONLY good things to say about it, saying that DX and OGL are extremely restrictive and wasteful, but were necessary in the past.

I mean, think about it... Larrabee is a 2 TFLOPS CPU... with SOME backwards compatibility using software emulation of DX, etc...
The Penryn quad core is a CPU with a mere 0.04 TFLOPS. We are talking about 50 times the performance. (FLOPS = floating-point operations per second.)
This 50-fold increase in operations per second was achieved by adding more cores, removing cache, and removing superfluous hardware decoders (MMX, SSE1-4, etc.), replacing them with more and more ALU units (of the FPU variety). And from what I read, a full HALF of the space is dedicated to x86 decoding hardware.
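Just to check the arithmetic in the post above (using the peak figures quoted in this thread, not measured numbers):

```cpp
#include <cstdio>

int main() {
    // Peak figures as quoted in this thread, not measured values.
    const double larrabeeTflops = 2.0;  // Intel's claimed Larrabee target
    const double penrynTflops   = 0.04; // quad-core Penryn, as stated above
    const double gtx280Tflops   = 0.8;  // GTX 280 figure quoted earlier

    std::printf("Larrabee vs. Penryn quad: %.0fx\n", larrabeeTflops / penrynTflops);
    std::printf("Larrabee vs. GTX 280    : %.1fx\n", larrabeeTflops / gtx280Tflops);
    return 0;
}
```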
 