Physics Card?

Page 3

knyghtbyte

Senior member
Oct 20, 2004
918
1
0
OK, here's a simple reason why a PPU would make a large portion of gamers happy...

When HL2 first came out, how many people spent a good 5-10 minutes throwing trash at the guards and fellow citizens in the train station where you first get control of Gordon? Go on, admit it, you did, didn't you? I know I did; I was wetting myself...

Or when you first got onto CS:S, how much fun did you have shooting barrels and melons around...

OK, that's the fun aspect... but now look at the application side of it. On de_dust, bombsite A, shooting the barrels over to where you place the bomb makes it awkward for someone to get near it and defuse in time; on Prodigy, shoot the chairs over to cover your bomb; on Office, use the filing cabinets to stop the CTs getting near the hossies easily... and in HL2 you can obviously use various pieces of furniture/scenery/props to help you...

It's simple: the greater the physics interaction with the world around you, the more enjoyable and varied your gaming experience. But making as much of the world around you as usable as possible needs more than what the CPU can do... simple.
 

linkgoron

Platinum Member
Mar 9, 2005
2,395
969
136
Originally posted by: vss1980
It will never get integrated into a CPU - a PPU would require its own bit of memory, and probably a fair bit of it if it has to store a particularly large environment packed full of different objects, all with different properties attached.

For example, take a game like Serious Sam, where the levels were huge - if we started modelling every element in the game to the point where every building was constructed of blocks we could destroy until it fell down, a whole lot of memory would be needed. With a CPU, the memory would either have to be on-chip, which is unlikely, or use the system memory, which eventually means that a PPU integrated into a CPU would be of little benefit, especially in this era of SMP on a single chip.

The graphics card idea actually holds a bit of promise. For example, as DirectX ramps up in versions, so do the graphics architectures. If the physics requirements go up as well, then as you upgrade a graphics card from, say, a DX10 level to a DX12 level, you get the improvements in physics too. Also, graphics cards usually have a lot of RAM on them anyway, so a PPU with its own little bit of memory isn't going to make a massive difference in some cases, especially if it doesn't need lightning-fast RAM - it could just take up a small space to hold the PPU chip itself and a single memory chip (after all, high-density memory is not too hard to get anymore). I'm sure ATI or nVidia could be persuaded to add a chip to their boards if it proves even mildly popular or beneficial. Besides, it wouldn't be the first time we've seen 3D cards with extra chips offering sound output, etc., all on the same board.
Integrating the PPU into the graphics chip would be good, but then you have two things using up the graphics RAM bandwidth, which would hurt performance, so that wouldn't be in their interest.

So basically you want a PPU to take bandwidth from the GPU, take memory from the GPU, and make more heat in one place (on the graphics card, which means faster/more fans) - a card that will need more power and more PCI-E lanes?
If they make a PPU they should make it standalone, with 128 or 256 MB of RAM, and fanless.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I can't understand why anyone who enjoys technology wouldn't want this. CPUs are more of a general-purpose processor; based on the pre-production specs, I don't see them matching the physics processing power of these Ageia chips even in the next 10 years. That's why a PPU is a good thing. It's specialized hardware, just like a GPU. If you can't see the similarity between GPU vs. CPU and PPU vs. CPU, then you don't know enough about them and should do a little more research before saying you don't want the technology.

Having a PPU means having destructible objects... and I'm not talking about a glass bottle that "breaks" into two broken shards of glass like in HL2. I'm talking about a brick wall that's actually made of individual bricks - not a single model with a texture on it that makes it look like bricks. Actual individual bricks that make up the wall... and being able to blow it up and break those bricks into even smaller pieces.

Another area people don't usually consider is fluid simulation. Imagine actually interacting with the water in games, not just a rippling flat texture with some pixel shading and some puffs of white stuff that's supposed to be a splash or mist... you jump in the water and ACTUALLY make a splash that gets the dock wet. Then you get out of the water and the water actually drips off you.
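
To make the "wall of individual bricks" idea concrete, here is a minimal sketch (not Ageia's SDK - every name and number below is invented) of the per-brick state a game would have to integrate every tick. The point is that the work scales with the number of live pieces, which is exactly the load a dedicated chip is meant to absorb.

# Hypothetical sketch: a "wall" as individual rigid bricks instead of one textured model.
# Each brick carries its own state, so the physics cost scales with the brick count.
import math

GRAVITY = -9.81          # m/s^2
DT = 1.0 / 60.0          # 60 Hz physics tick

class Brick:
    def __init__(self, x, y, z):
        self.pos = [x, y, z]
        self.vel = [0.0, 0.0, 0.0]

    def apply_impulse(self, origin, strength):
        # Push the brick away from the blast origin, falling off with distance squared.
        d = [p - o for p, o in zip(self.pos, origin)]
        dist = math.sqrt(sum(c * c for c in d)) or 1e-6
        scale = strength / (dist * dist)
        self.vel = [v + (c / dist) * scale for v, c in zip(self.vel, d)]

    def step(self):
        self.vel[1] += GRAVITY * DT
        self.pos = [p + v * DT for p, v in zip(self.pos, self.vel)]
        if self.pos[1] < 0.0:            # crude ground collision
            self.pos[1] = 0.0
            self.vel[1] *= -0.3          # lose energy on the bounce

# A 20x10 wall is already 200 live bodies to integrate (and collision-test) every tick.
wall = [Brick(x * 0.2, y * 0.1, 0.0) for x in range(20) for y in range(10)]
for b in wall:
    b.apply_impulse(origin=(2.0, 0.5, -1.0), strength=5.0)
for _ in range(120):                      # two seconds of simulation
    for b in wall:
        b.step()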

Games are not the only thing that would benefit from a PPU... think of the weather. With more physics processing power, the weather can be more accurately predicted through simulation.
 

Avalon

Diamond Member
Jul 16, 2001
7,567
156
106
I'm all up for a PPU if it'll allow a significant improvement in physics.
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Originally posted by: seanp789
Bet on Soldier will be the first game released to support the physics chip. I wouldn't expect anything revolutionary until the Unreal 3 engine at the soonest. WTF ever happened to STALKER?

Any game that supports the physics chip will have the benefit of being able to take a major load off the CPU. Now the real question is: how long will it take developers to deliver content that can only be seen with the physics chip?

http://www.gamestop.com/product.asp?product%5Fid=645658
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
A couple of points.
1) Much more physics processing ability should have more impact on gameplay than even nicer graphics. It should lead to things like properly destructible environments, which are currently a long way off (even Unreal 3 only has *scripted* destruction - lots of stuff still can't be damaged - it's not like a real-world battle where, by the end, you've flattened most of the town you were fighting in).
2) Physics processing will still end up requiring more CPU power - just as a graphics card can keep a CPU busy feeding it new stuff to draw, a physics processor will keep a CPU busy giving it new physics calculations to do. Hence the more cores the better, really.
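
A rough sketch of point 2: even with the physics offloaded, the CPU still spends time each frame packaging up work and folding the results back into the game state. A worker thread stands in for the PPU here; nothing in it is a real Ageia API.

# Sketch only: a thread pool plays the role of the physics coprocessor.
from concurrent.futures import ThreadPoolExecutor

DT = 1.0 / 60.0

def solve_physics(batch):
    # Placeholder for the work the dedicated hardware would chew through:
    # drop every object a tiny bit, as if one tick of gravity had been applied.
    return [{"id": obj["id"], "y": obj["y"] - 0.5 * 9.81 * DT * DT} for obj in batch]

objects = [{"id": i, "y": 10.0} for i in range(1000)]

with ThreadPoolExecutor(max_workers=1) as ppu:        # "PPU" stand-in
    for frame in range(3):
        batch = [dict(o) for o in objects]             # CPU: marshal state for the coprocessor
        pending = ppu.submit(solve_physics, batch)     # hand the batch off
        # ... the CPU would run AI, game logic and draw calls here while physics runs ...
        results = pending.result()                     # CPU: pull the results back in
        for o, r in zip(objects, results):             # and fold them into game state
            o["y"] = r["y"]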
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
Originally posted by: linkgoron
Originally posted by: vss1980
It will never get integrated into a CPU - a PPU would require its own bit of memory, and probably a fair bit of it if it has to store a particularly large environment packed full of different objects, all with different properties attached.

For example, take a game like Serious Sam, where the levels were huge - if we started modelling every element in the game to the point where every building was constructed of blocks we could destroy until it fell down, a whole lot of memory would be needed. With a CPU, the memory would either have to be on-chip, which is unlikely, or use the system memory, which eventually means that a PPU integrated into a CPU would be of little benefit, especially in this era of SMP on a single chip.

The graphics card idea actually holds a bit of promise. For example, as DirectX ramps up in versions, so do the graphics architectures. If the physics requirements go up as well, then as you upgrade a graphics card from, say, a DX10 level to a DX12 level, you get the improvements in physics too. Also, graphics cards usually have a lot of RAM on them anyway, so a PPU with its own little bit of memory isn't going to make a massive difference in some cases, especially if it doesn't need lightning-fast RAM - it could just take up a small space to hold the PPU chip itself and a single memory chip (after all, high-density memory is not too hard to get anymore). I'm sure ATI or nVidia could be persuaded to add a chip to their boards if it proves even mildly popular or beneficial. Besides, it wouldn't be the first time we've seen 3D cards with extra chips offering sound output, etc., all on the same board.
Integrating the PPU into the graphics chip would be good, but then you have two things using up the graphics RAM bandwidth, which would hurt performance, so that wouldn't be in their interest.

So basically you want a PPU to take bandwidth from the GPU, take memory from the GPU, and make more heat in one place (on the graphics card, which means faster/more fans) - a card that will need more power and more PCI-E lanes?
If they make a PPU they should make it standalone, with 128 or 256 MB of RAM, and fanless.

Did you actually read my whole post?? I did reason that it wouldn't be a good idea to have the PPU integrated into the graphics chip or stealing RAM bandwidth from the GPU. I just said why not add it to the graphics board - we've seen plenty of graphics boards with extra stuff on them. OK, it may mean a warmer board / hot spot in the computer, but I doubt it would be a chip that generates that much heat - after all, some of the most complex DSP/processor chips can be made to run fairly cool; the Pentium M chips, for example, are quite powerful but comparatively ice cool. There is no reason why a PPU of the same complexity can't run as cool.

I do agree that if the chip has big requirements it will need to be stand-alone, and that maybe there should be a stand-alone version anyway. But if it has modest requirements, only puts out as much heat as a low-clocked Celeron M, only needs 32-64 MB of RAM and is fairly power-efficient, then it could quite happily co-exist on a graphics card, using its own separate memory chip. So far not much has been said about how meaty the PPU will be or how many resources it would need.

There is of course one main reason why I would choose to combine it with a graphics card - such a PPU would be aimed practically at gaming and little else (although practically speaking it may have several possible uses, such as distributed computing, encoding/decoding, etc. - it may even allow a manufacturer to offload certain things from the GPU to save die space as well as add extra features). Anyway, as it is so linked to gaming, it would be reasonable to tie it in with the only other component that makes a massive difference to today's gaming... graphics cards.
 

PAPutzback

Junior Member
Dec 12, 2003
10
0
0
People keep saying that the PPU is going to make graphics look nicer. That is not its purpose or its output - it will make graphics act more realistically. You can light a tank on fire now, and the code can blow it up after a certain amount of time; but if you throw in physics, parts of the tank can melt, the surrounding trees can catch fire, and the fire can even burn the grass the tank is rolling over. Rain could extinguish it.

The chip would also have instructions specific to crunching physics code - not nearly the number of instructions that a CPU has. Think about CPUs running at 3-4 GHz and GPUs running at less than 1 GHz. Yet nobody uses the CPU to draw games, even though it is 4 times faster if you go by the GHz. This is the reason for specialization of processors. Future PPUs might have separate processors for specific environments - handling rain falling into puddles, or the waves caused by a boat in the water, or the effect of bullets on brick, metal and wood. How cool would it be to shoot a hole through a bunker wall to add a new window?
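
A small software-side illustration of that specialization argument, assuming NumPy as a stand-in for "wide" special-purpose hardware: the same particle update done one object at a time versus as a single operation over the whole set. The timings are machine-dependent and say nothing about a real PPU.

# Illustration only: scalar ("general-purpose") update vs. a data-parallel update.
import time
import numpy as np

N = 200_000
DT, G = 1.0 / 60.0, -9.81

# Scalar style: touch each particle individually.
ys = [100.0] * N
vs = [0.0] * N
t0 = time.perf_counter()
for i in range(N):
    vs[i] += G * DT
    ys[i] += vs[i] * DT
scalar_time = time.perf_counter() - t0

# Data-parallel style: one operation applied across all particles at once.
y = np.full(N, 100.0)
v = np.zeros(N)
t0 = time.perf_counter()
v += G * DT
y += v * DT
vector_time = time.perf_counter() - t0

print(f"scalar: {scalar_time:.4f}s  vectorized: {vector_time:.4f}s")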
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
I do not really want a world where we will have a polygon card, AI card, physics card, shader card, effects card, bass card, treble card, etc...


Do you see where I am going? That is where we are heading. They are going to make a processor for every little thing.

10 years from now, the term "word processor" will not be referring to software.
 

fishbits

Senior member
Apr 18, 2005
286
0
0
On the silly side: the physics card could calculate just how hard it is for a bunny-hopping soldier to actually hit someone while carrying 50 lbs of gear over rough terrain as fast as a man can run, and how many consecutive hops he gets before his knee blows out.

Oh, and advancement in the field of jiggles
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: dguy6789
I do not really want a world where we will have a polygon card, AI card, physics card, shader card, effects card, bass card, treble card, etc...


Do you see where I am going? That is where we are heading. They are going to make a processor for every little thing.

10 years from now, the term "word processor" will not be referring to software.

Of course, because that is a more efficient use. If you can make 20 CPUs do 20 very specific tasks and then have one mother CPU watching over them... well, the performance would be jaw-dropping. This is the future of computing, IMO. I know some people hate it, but I welcome it, since it means the advancement of computing in general.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: dguy6789
I do not really want a world where we will have a polygon card, AI card, physics card, shader card, effects card, bass card, treble card, etc...


Do you see where I am going? That is where we are heading. They are going to make a processor for every little thing.

10 years from now, the term "word processor" will not be referring to software.

When technology catches up to the point where it's more efficient to combine things, then they will. But until then, it's more efficient to have them separate. It's just like when computers had math coprocessors because the regular processor couldn't handle floating-point numbers well enough; when technology advanced far enough, it became possible to build that into the CPU. This is no different. When technology advances enough, the PPU will likely be integrated into the GPU. But that time is not here yet... the die size would likely be too large. So this is the time for separate, specialized hardware, so we can enjoy more realistic physics calculated in real time.
 

jb20thae

Member
Jul 26, 2005
133
0
0
Like the above poster said, this reminds me of the math coprocessors. This sounds like the kind of stuff consoles are building in; I don't think current CPU architecture is ideal for such processing (like the Cell processor?).
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: jb20thae
Like the above poster said, this reminds me of the math coprocessors. This sounds like the kind of stuff consoles are building in; I don't think current CPU architecture is ideal for such processing (like the Cell processor?).

As I said, CPUs are more of a general-purpose processor. They're supposed to do anything you throw at them pretty well. However, if you build a more specialized processor, it will do one thing very well and other things poorly or not at all. A perfect example is the Itanium: it does what it does extremely well, but try to run x86 code on it and it falls flat on its face. A GPU running as a CPU would suck too. A 16-pipeline 6800 GPU is almost like a 16-core processor; even if it could run Windows, it would do so very poorly, since running an OS doesn't require large amounts of parallel processing.
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
Actually, maths co-processors weren't really necessary for the vast majority of users in the early days of computing. Hell, even running Windows normally and using Word/Excel/etc. should invoke few FPU calls, if any at all - it's all integer maths.
FPUs were introduced, of course, because they could do maths calcs much faster and more accurately, but also because there was a need for them in applications - I wouldn't have said it was technology catching up, more the opposite: the requirement caught up with what technology could do and what was demanded. In the early IBM AT days, for example, there wasn't a need - it was all word processing, spreadsheets and payrolls. By the time of the 386, with more graphics and industrial PC usage, there of course was a need.

I remember when my cousin was at university.
I had just got a brand spanking new 486DX 33MHz (fast in its day), which of course had an on-chip FPU. He was doing aerodynamics calculations and running simulation software on his own PC (a 386DX 40MHz), which was taking an age, and he was most happy when he came home for a while and was able to use the 486, which more than halved the simulation time - less than an hour instead of several hours.

SSE/SIMD didn't just come about because it was good for gamers - lots of stuff uses it, from graphics to Windows Media Player.

The PPU has a different problem though - which is why I never see it being integrated into a CPU or another chip... it has no real effective uses apart from physics calculations. There's hardly a massive call for that at the moment or in the near future - unless, of course, its abilities can be put to use in another field of interest.

On a side note, I have actually thought of a couple of things a PPU could help with apart from game physics... sound output and simulation work.
It would be the perfect chip to program an environment into and then, maybe/hopefully, calculate sound effects such as reflections/distortions, etc.
In terms of simulators, a PPU (or several) could mean even better flight simulators; or, for sports fans, racing teams could do even better suspension simulation work back at the factory whilst the car is still at the track, helping resolve set-up issues, etc.
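
For a sense of what that suspension-simulation work looks like numerically, here is a minimal quarter-car-style sketch: one spring-damper driven over a bump, with entirely made-up constants. A real team model would add tyre stiffness, load transfer and the other three corners.

# Sketch only: one corner of a car as a spring-damper, all constants invented.
SPRING_K = 60_000.0    # spring rate, N/m
DAMPER_C = 4_000.0     # damping, N*s/m
MASS = 300.0           # mass carried by this corner, kg
DT = 0.001             # 1 kHz integration step

def road(t):
    return 0.05 if 0.5 <= t < 0.6 else 0.0    # a 5 cm bump lasting 0.1 s

x, v = 0.0, 0.0                                # body travel from rest, and its velocity
peak = 0.0
for step in range(2000):                       # simulate two seconds
    t = step * DT
    force = SPRING_K * (road(t) - x) - DAMPER_C * v   # spring + damper about equilibrium
    a = force / MASS
    v += a * DT
    x += v * DT
    peak = max(peak, abs(x))

print(f"peak body travel over the bump: {peak * 1000:.1f} mm")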

The problem with the simulations idea is that these are fairly specialised fields - hardly the sort of thing that will benefit the general user.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Also, don't forget that even when discrete hardware is faster in some situations, it might be slower in others. Like HW T&L on the GeForce 1 or 2 - it was fast when there was only one light in the scene, but when you had eight lights, software T&L on a fast CPU was actually significantly faster.

As long as you were not running any other code at all. The difference with hardware T&L is that your processor could be at 100% load, and you could then throw a couple of million polys with multiple lights at it and not take a huge performance hit.

In realistic terms - a PPU will be required for ports from the consoles. The X360's processor slaughters anything in the x86 realm for physics processing, and that in turn is crushed by Cell. The highest-performing multi-core CPU available right now for PCs is very weak in comparison to what is going to be needed to keep up with the set-top boxes. A high-end multi-core CPU paired with a PPU is comparable to a PS3 in terms of running code (according to Ageia).
 

Chaotic42

Lifer
Jun 15, 2001
33,932
1,113
126
My question is, will non-game applications be able to take advantage of the physics card? Matlab? 3D Studio Max? Distributed Computing?
 

biostud

Lifer
Feb 27, 2003
18,598
5,299
136
Originally posted by: vss1980

The PPU has a different problem though - which is why I never see it being integrated into a CPU or another chip... it has no real effective uses apart from physics calculations. There's hardly a massive call for that at the moment or in the near future - unless, of course, its abilities can be put to use in another field of interest.

On a side note, I have actually thought of a couple of things a PPU could help with apart from game physics... sound output and simulation work.
It would be the perfect chip to program an environment into and then, maybe/hopefully, calculate sound effects such as reflections/distortions, etc.
In terms of simulators, a PPU (or several) could mean even better flight simulators; or, for sports fans, racing teams could do even better suspension simulation work back at the factory whilst the car is still at the track, helping resolve set-up issues, etc.

You can get 3D Studio Max plugins for PhysX.

click
 

linkgoron

Platinum Member
Mar 9, 2005
2,395
969
136
Originally posted by: vss1980
Originally posted by: linkgoron
Originally posted by: vss1980
It will never get integrated into a CPU - a PPU would require its own bit of memory, and probably a fair bit of it if it has to store a particularly large environment packed full of different objects, all with different properties attached.

For example, take a game like Serious Sam, where the levels were huge - if we started modelling every element in the game to the point where every building was constructed of blocks we could destroy until it fell down, a whole lot of memory would be needed. With a CPU, the memory would either have to be on-chip, which is unlikely, or use the system memory, which eventually means that a PPU integrated into a CPU would be of little benefit, especially in this era of SMP on a single chip.

The graphics card idea actually holds a bit of promise. For example, as DirectX ramps up in versions, so do the graphics architectures. If the physics requirements go up as well, then as you upgrade a graphics card from, say, a DX10 level to a DX12 level, you get the improvements in physics too. Also, graphics cards usually have a lot of RAM on them anyway, so a PPU with its own little bit of memory isn't going to make a massive difference in some cases, especially if it doesn't need lightning-fast RAM - it could just take up a small space to hold the PPU chip itself and a single memory chip (after all, high-density memory is not too hard to get anymore). I'm sure ATI or nVidia could be persuaded to add a chip to their boards if it proves even mildly popular or beneficial. Besides, it wouldn't be the first time we've seen 3D cards with extra chips offering sound output, etc., all on the same board.
Integrating the PPU into the graphics chip would be good, but then you have two things using up the graphics RAM bandwidth, which would hurt performance, so that wouldn't be in their interest.

So basically you want a PPU to take bandwidth from the GPU, take memory from the GPU, and make more heat in one place (on the graphics card, which means faster/more fans) - a card that will need more power and more PCI-E lanes?
If they make a PPU they should make it standalone, with 128 or 256 MB of RAM, and fanless.

Did you actually read my whole post?? I did reason that it wouldn't be a good idea to have the PPU integrated into the graphics chip or stealing RAM bandwidth from the GPU. I just said why not add it to the graphics board - we've seen plenty of graphics boards with extra stuff on them. OK, it may mean a warmer board / hot spot in the computer, but I doubt it would be a chip that generates that much heat - after all, some of the most complex DSP/processor chips can be made to run fairly cool; the Pentium M chips, for example, are quite powerful but comparatively ice cool. There is no reason why a PPU of the same complexity can't run as cool.

I do agree that if the chip has big requirements it will need to be stand-alone, and that maybe there should be a stand-alone version anyway. But if it has modest requirements, only puts out as much heat as a low-clocked Celeron M, only needs 32-64 MB of RAM and is fairly power-efficient, then it could quite happily co-exist on a graphics card, using its own separate memory chip. So far not much has been said about how meaty the PPU will be or how many resources it would need.

There is of course one main reason why I would choose to combine it with a graphics card - such a PPU would be aimed practically at gaming and little else (although practically speaking it may have several possible uses, such as distributed computing, encoding/decoding, etc. - it may even allow a manufacturer to offload certain things from the GPU to save die space as well as add extra features). Anyway, as it is so linked to gaming, it would be reasonable to tie it in with the only other component that makes a massive difference to today's gaming... graphics cards.


But you're forgetting that even if the PPU's temperature is only 20-40C, it will add to the GPU's 60C/70C/80C/whatever and cause heat problems. Even if the PPU wouldn't need a fan by itself, adding it to the GPU (now, at least) would force GPU makers to put dual-slot cooling on every card...
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
Yep, I am aware of this - there are a lot of ifs and buts...
I just don't like the idea of having yet another card to achieve something - especially something so linked to changing specifications like DirectX... it would be good to have it on something that also changes, like the graphics card.

I have thought of another possible way of implementing it.
Back in the good old 486 days, even the 486DXs, which had their own FPU, still had the option of an additional co-processor from companies like Weitek. As AMD have their quite useful HyperTransport bus, which offers quite a bit of flexibility, maybe some mobo makers could put in a small socket for a PPU chip to plug into, with direct high-bandwidth access to the HyperTransport bus shared with the main CPU (or CPUs if SMP).

As long as the HyperTransport protocol stays roughly the same, the socket type wouldn't need to change for years and could also allow upgradability. Add some embedded DRAM onto the chip for processing whilst the 'physics world' is stored in normal RAM, and Bob's your uncle.

Intel platforms could make use of the chipset hub architecture they use (unless Intel keeps on using the same bus spec beyond the P4).

Pie in the sky, maybe, but doable, I reckon.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: linkgoron
Originally posted by: vss1980
Originally posted by: linkgoron
Originally posted by: vss1980
It will never get integrated into a CPU - a PPU would require its own bit of memory, and probably a fair bit of it if it has to store a particularly large environment packed full of different objects, all with different properties attached.

For example, take a game like Serious Sam, where the levels were huge - if we started modelling every element in the game to the point where every building was constructed of blocks we could destroy until it fell down, a whole lot of memory would be needed. With a CPU, the memory would either have to be on-chip, which is unlikely, or use the system memory, which eventually means that a PPU integrated into a CPU would be of little benefit, especially in this era of SMP on a single chip.

The graphics card idea actually holds a bit of promise. For example, as DirectX ramps up in versions, so do the graphics architectures. If the physics requirements go up as well, then as you upgrade a graphics card from, say, a DX10 level to a DX12 level, you get the improvements in physics too. Also, graphics cards usually have a lot of RAM on them anyway, so a PPU with its own little bit of memory isn't going to make a massive difference in some cases, especially if it doesn't need lightning-fast RAM - it could just take up a small space to hold the PPU chip itself and a single memory chip (after all, high-density memory is not too hard to get anymore). I'm sure ATI or nVidia could be persuaded to add a chip to their boards if it proves even mildly popular or beneficial. Besides, it wouldn't be the first time we've seen 3D cards with extra chips offering sound output, etc., all on the same board.
Integrating the PPU into the graphics chip would be good, but then you have two things using up the graphics RAM bandwidth, which would hurt performance, so that wouldn't be in their interest.

So basically you want a PPU to take bandwidth from the GPU, take memory from the GPU, and make more heat in one place (on the graphics card, which means faster/more fans) - a card that will need more power and more PCI-E lanes?
If they make a PPU they should make it standalone, with 128 or 256 MB of RAM, and fanless.

Did you actually read my whole post?? I did reason that it wouldn't be a good idea to have the PPU integrated into the graphics chip or stealing RAM bandwidth from the GPU. I just said why not add it to the graphics board - we've seen plenty of graphics boards with extra stuff on them. OK, it may mean a warmer board / hot spot in the computer, but I doubt it would be a chip that generates that much heat - after all, some of the most complex DSP/processor chips can be made to run fairly cool; the Pentium M chips, for example, are quite powerful but comparatively ice cool. There is no reason why a PPU of the same complexity can't run as cool.

I do agree that if the chip has big requirements it will need to be stand-alone, and that maybe there should be a stand-alone version anyway. But if it has modest requirements, only puts out as much heat as a low-clocked Celeron M, only needs 32-64 MB of RAM and is fairly power-efficient, then it could quite happily co-exist on a graphics card, using its own separate memory chip. So far not much has been said about how meaty the PPU will be or how many resources it would need.

There is of course one main reason why I would choose to combine it with a graphics card - such a PPU would be aimed practically at gaming and little else (although practically speaking it may have several possible uses, such as distributed computing, encoding/decoding, etc. - it may even allow a manufacturer to offload certain things from the GPU to save die space as well as add extra features). Anyway, as it is so linked to gaming, it would be reasonable to tie it in with the only other component that makes a massive difference to today's gaming... graphics cards.


But you're forgetting that even if the PPU's temperature is only 20-40C, it will add to the GPU's 60C/70C/80C/whatever and cause heat problems. Even if the PPU wouldn't need a fan by itself, adding it to the GPU (now, at least) would force GPU makers to put dual-slot cooling on every card...

I am not sure we can say that it will cause heat problems. It could, but I would not say that is definite. We are not really sure of its future implementation at this point. But if it did have a dual-slot cooler, it would still be better than having to install an additional card, IMO.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: vss1980
Actually, maths co-processors weren't really necessary for the vast majority of users in the early days of computing. Hell, even running Windows normally and using Word/Excel/etc. should invoke few FPU calls, if any at all - it's all integer maths.
FPUs were introduced, of course, because they could do maths calcs much faster and more accurately, but also because there was a need for them in applications - I wouldn't have said it was technology catching up, more the opposite: the requirement caught up with what technology could do and what was demanded. In the early IBM AT days, for example, there wasn't a need - it was all word processing, spreadsheets and payrolls. By the time of the 386, with more graphics and industrial PC usage, there of course was a need.

I remember when my cousin was at university.
I had just got a brand spanking new 486DX 33MHz (fast in its day), which of course had an on-chip FPU. He was doing aerodynamics calculations and running simulation software on his own PC (a 386DX 40MHz), which was taking an age, and he was most happy when he came home for a while and was able to use the 486, which more than halved the simulation time - less than an hour instead of several hours.

SSE/SIMD didn't just come about because it was good for gamers - lots of stuff uses it, from graphics to Windows Media Player.

The PPU has a different problem though - which is why I never see it being integrated into a CPU or another chip... it has no real effective uses apart from physics calculations. There's hardly a massive call for that at the moment or in the near future - unless, of course, its abilities can be put to use in another field of interest.

On a side note, I have actually thought of a couple of things a PPU could help with apart from game physics... sound output and simulation work.
It would be the perfect chip to program an environment into and then, maybe/hopefully, calculate sound effects such as reflections/distortions, etc.
In terms of simulators, a PPU (or several) could mean even better flight simulators; or, for sports fans, racing teams could do even better suspension simulation work back at the factory whilst the car is still at the track, helping resolve set-up issues, etc.

The problem with the simulations idea is that these are fairly specialised fields - hardly the sort of thing that will benefit the general user.

The problem with that is you're assuming the application of PCs in the home won't change over time. When the math coprocessor was introduced, who would have thought it would ever be a part of daily computer life? You said yourself, "In the early IBM AT days, for example, there wasn't a need - it was all word processing, spreadsheets and payrolls." Then you say, "SSE/SIMD didn't just come about because it was good for gamers - lots of stuff uses it, from graphics to Windows Media Player." May I also point out that for the longest time, 3D graphics were for games and rendering/animation? Now take a look at OS X and Windows Vista: the GPU has gone from being a toy for 3D gamers and a tool for graphic designers and programmers to a piece of hardware required by the operating system for the average Joe.

I know you're not one of the types saying you don't need it and don't want it. But to those who are... are you so narrow-minded and short-sighted that you can't see that this is another revolutionary technological addition to the PC? Think simulated hair... individual strands, all with gravity acting on them and acting against each other. Think fluid dynamics... simulated water, simulated weather, the ability to better simulate designs for internal combustion engines... with the ability to simulate the air, the atomized fuel, the oil, and the bearings and parts inside the engine.
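
A minimal sketch of the "individual strands of hair" idea: one strand as a chain of points under gravity, with each link held to a fixed length using a common position-based (Verlet) trick. The constants are invented, and a full head of hair would repeat this for thousands of strands plus collisions.

# Sketch only: one hair strand as a pinned chain of points with length constraints.
import math

SEGMENTS = 20
LINK_LEN = 0.01        # 1 cm per segment
GRAVITY = -9.81
DT = 1.0 / 60.0

# Strand starts sticking out sideways from the scalp; it droops under gravity.
points = [[i * LINK_LEN, 0.0] for i in range(SEGMENTS + 1)]
prev = [p[:] for p in points]    # previous positions for Verlet integration

def step(points, prev):
    # Integrate every point except the root (index 0), which stays pinned.
    for i in range(1, len(points)):
        x, y = points[i]
        px, py = prev[i]
        prev[i] = [x, y]
        points[i] = [x + (x - px), y + (y - py) + GRAVITY * DT * DT]
    # Enforce segment lengths a few times so the strand doesn't stretch.
    for _ in range(5):
        for i in range(len(points) - 1):
            ax, ay = points[i]
            bx, by = points[i + 1]
            dx, dy = bx - ax, by - ay
            dist = math.hypot(dx, dy) or 1e-9
            diff = (dist - LINK_LEN) / dist
            if i == 0:                       # root pinned: move only the free end
                points[i + 1] = [bx - dx * diff, by - dy * diff]
            else:                            # otherwise split the correction
                points[i][0] += dx * diff * 0.5
                points[i][1] += dy * diff * 0.5
                points[i + 1][0] -= dx * diff * 0.5
                points[i + 1][1] -= dy * diff * 0.5

for _ in range(120):                         # two simulated seconds
    step(points, prev)
print("strand tip after 2 s:", points[-1])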

If all you're thinking about is, "wow, I can drive a truck through an old shack," I can see how you wouldn't be that impressed or excited about the technology. But the technology is much MUCH more than that.
 