R580 Details - Foxconn already has design kits for ATI's R580 socket


Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I think they will just incorporate an extra PPU chip (maybe an upgradable socket) on future gens. I think it will have to be standardized with the AGEIA chip to avoid monopolistic lawsuits. Just a guess.
 
Jan 24, 2005
168
0
0
I think that is the right idea. Integrate the PPU chip on the graphics board (or motherboard), give it a few MB of dedicated memory and call it good.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: ArchAngel777
Originally posted by: apoppin
Originally posted by: keysplayr2003
Originally posted by: apoppin
yeah . . . it CAN be done :Q

:thumbsup:

!

That's what "could" means, Gamingphreek . . . not "maybe"
:roll:

Maybe you should simmer down a bit, apoppin. You are adding more to the article than there really was. You always take the "Wait and see" attitude. What's different today?
Respectfully,
Keys.

I would love to see the Ageia PhysX chip reviewed. I heard we may see something in December.
AGEIA's PPU will be available in Q2 '06

and why 'simmer down'? . . . you NEVER tell a fellow nVidian to 'simmer down'
:roll:

i am adding NOTHING to the article but my opinion
[as usual]

and i do NOT "always take the 'Wait and see' attitude" . . . something 'exciting' CAN be done on R520 . . . nice, but i do not expect to actually SEE it till r580 or r600 as drivers take a LONG time [upwards of a year if x-fire is an indication]


There MUST be a secret code somewhere in this garbled text. *searches*

All this simmering has me hungry for bacon.

let us know what you find


I had no idea you were not neutral. Always thought you were. My mistake.

Not neutral?
:Q

neutral? . . . how so?
:shocked:

[now you've really got me curious]


IF you mean i prefer ATI over AGEIA . . . you are quite correct, Sir!

 

Velk

Senior member
Jul 29, 2004
734
0
0
Originally posted by: Lonyo
Originally posted by: DeathReborn
Originally posted by: apoppin
yeah . . . it CAN be done :Q

:thumbsup:

!

That's what "could" means, Gamingphreek . . . not "maybe"
:roll:

If it's a choice between 60fps solid and 40fps with a little bit of added Physics, I'd choose 60fps every time. I personally am not willing to lose performance to calculate Physics on a GPU; if you are, then you're something else.
If you're losing fps by doing physics on the graphics card, then it means you are GPU bound not CPU bound, which means the CPU could be doing more physics anyway so you don't need to offload onto the GPU....


I think you missed the 'bit of added Physics' part of the text you quoted. I.e., he would prefer to have the added physics features turned off rather than slowing down the GPU to include them.

Much in the same way that some people turn off anti-aliasing.

In any case, I think the suggestion that the R520 cards can do both graphics work and physics work as fast or faster than a graphics AND a physics card combined is absurd.

That it can do graphics work at the same speed whether it is simultaneously doing physics or not is implausible, if not outright impossible.
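To put rough numbers on the 60fps-vs-40fps trade-off quoted above, here is a back-of-envelope sketch; the fps figures come from the post, and the derived physics cost is only as real as that hypothetical example:

```python
# Back-of-envelope math for the 60fps-vs-40fps trade-off above.
# The fps figures come from the post; the derived physics cost is
# only as real as that hypothetical example.

def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

graphics_only = frame_time_ms(60)    # ~16.7 ms per frame
with_physics = frame_time_ms(40)     # ~25.0 ms per frame

physics_cost = with_physics - graphics_only
print(f"added physics costs ~{physics_cost:.1f} ms of GPU time per frame")
# -> ~8.3 ms, i.e. half of the original 16.7 ms frame budget
```

Whether ~8 ms of GPU time per frame is a fair price for effects physics is exactly the trade-off being argued here.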
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Velk
Originally posted by: Lonyo
Originally posted by: DeathReborn
Originally posted by: apoppin
yeah . . . it CAN be done :Q

:thumbsup:

!

That's what "could" means, Gamingphreek . . . not "maybe"
:roll:


If it's a choice between 60fps solid and 40fps with a little bit of added Physics, I'd choose 60fps every time. I personally am not willing to lose performance to calculate Physics on a GPU; if you are, then you're something else.
If you're losing fps by doing physics on the graphics card, then it means you are GPU bound not CPU bound, which means the CPU could be doing more physics anyway so you don't need to offload onto the GPU....


I think you missed the 'bit of added Physics' part of the text you quoted. I.e., he would prefer to have the added physics features turned off rather than slowing down the GPU to include them.

Much in the same way that some people turn off anti-aliasing.

In any case, I think the suggestion that the R520 cards can do both graphics work and physics work as fast or faster than a graphics AND a physics card combined is absurd.

That it can do graphics work at the same speed whether it is simultaneously doing physics or not is implausible, if not outright impossible.
then what does this mean?

However, the indication we had was that ATI could actually do physics calculations on the card with the graphics processing simultaneously -- the bandwidth is already there
sounds good to me . . . and without a doubt r580's bandwidth will be even greater. . .
:Q
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
UPDATE:

Havok to compete with AGEIA for physics
Many have heard of AGEIA and its startling announcement: it will produce a processor used exclusively for physics-related computations. Called the PPU, or Physics Processing Unit, its role will be to offload highly intensive mathematics, such as realistic water movement and realistic character reactions to objects and the world, from the CPU to a dedicated processor. This all seems like the natural progression of things, since dedicated sound, network and other processors are commonplace.

Today, however, most processors spend much of their time idling - you're rarely pushing your hardware to its limits consistently. Thus Havok, a company that's well known to game developers, has announced plans to do for you what AGEIA promises, but to save you money and maximize your dollar spent at the same time. Indeed, Havok has confirmed with us that they are competing with AGEIA.

The Havok FX engine is what Havok claims will provide the functionality of a PPU, but its approach is entirely different from AGEIA's. What's special about Havok FX is that it's a software engine, currently based on Havok's widely used physics engines, designed to offload many intensive physics functions from the CPU to the GPU. Using technology available in Shader Model 3.0 and beyond, the Havok FX engine will be able to take advantage of the unused resources of today's powerful GPUs and put them to use.
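To make the "offload physics to the GPU" idea concrete: in the standard SM3.0-era GPGPU technique, per-object state presumably lives in textures and gets updated by pixel shaders, one object per shader invocation. Here is a minimal CPU-side sketch of that data-parallel update, with NumPy arrays standing in for the textures -- all names and numbers are illustrative, not Havok's actual API:

```python
import numpy as np

# Sketch of the data-parallel "effects physics" update that a GPU
# approach like Havok FX would map onto SM3.0 pixel shaders: each
# particle's position/velocity lives in a texture texel, and one shader
# invocation integrates one particle per frame. NumPy arrays stand in
# for those textures here; all names and numbers are illustrative.

N = 4096                                   # number of debris particles
pos = np.random.rand(N, 3).astype(np.float32) * 10.0
vel = np.zeros((N, 3), dtype=np.float32)
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)
dt = 1.0 / 60.0                            # one 60 fps frame

def step(pos, vel):
    """One Euler integration step applied to every particle at once --
    the same arithmetic a pixel shader would run per texel."""
    vel = vel + gravity * dt
    pos = pos + vel * dt
    # cheap ground-plane collision: clamp and damp anything below y=0
    below = pos[:, 1] < 0.0
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.5
    return pos, vel

for _ in range(60):                        # simulate one second
    pos, vel = step(pos, vel)
```

The appeal of the GPU route is that this per-particle arithmetic is exactly the kind of wide, independent work shader units are built for; the open question in this thread is what it costs the rendering workload sharing the same chip.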
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I can't seriously see anyone using their graphics card to do physics, as they will always want more fps or some higher graphics setting they could run at. Where this might work is if you have an older but still competent graphics card: plug that into your second PCI-E slot and use it for physics. However, at the moment all we have is marketing bull - who knows whether ATI will be willing to spend the time and money to develop it? It would be hard to make it work well (i.e. competitive with AGEIA), especially as the hardware isn't even designed to be a PPU. It might just be cheaper for ATI to buy AGEIA.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Dribble
I can't seriously see anyone using their graphics card to do physics, as they will always want more fps or some higher graphics setting they could run at. Where this might work is if you have an older but still competent graphics card: plug that into your second PCI-E slot and use it for physics. However, at the moment all we have is marketing bull - who knows whether ATI will be willing to spend the time and money to develop it? It would be hard to make it work well (i.e. competitive with AGEIA), especially as the hardware isn't even designed to be a PPU. It might just be cheaper for ATI to buy AGEIA.
BOTH Havok and ATI disagree with you . . . READ the UPDATE.
 

route66

Senior member
Sep 8, 2005
295
0
0
Apoppin, just because 'bandwidth is there' doesn't mean it's 'free' processing. For example, look at all the non-real-world benchmarks that test bandwidth that doesn't always translate to real-world performance.

The fact of the matter is ATI and Havok are using SM3.0 processing power to compute physics - this is taking power away from the GPU. Granted, not all SM3.0 features may be in use at the moment, so there's an illusion of 'free' processing power, but as more games use these new features, there will be less of this 'free' processing power left over.
 
Mar 19, 2003
18,289
2
71
Originally posted by: route66
Apoppin, just because 'bandwidth is there' doesn't mean it's 'free' processing. For example, look at all the non-real-world benchmarks that test bandwidth that doesn't always translate to real-world performance.

The fact of the matter is ATI and Havok are using SM3.0 processing power to compute physics - this is taking power away from the GPU. Granted, not all SM3.0 features may be in use at the moment, so there's an illusion of 'free' processing power, but as more games use these new features, there will be less of this 'free' processing power left over.

That's what I'm worried about. The computing cycles have to come from somewhere... We don't know exactly how it'll work yet, but I kind of find it hard to believe that there wouldn't be any hit on 3D performance at all. Plus, the AT article had a quote in there that somewhat worried me too:

"It is definitely the case that load-balancing is a key challenge for both effects physics and graphics. Enabling effects physics via the GPU offers much greater flexibility for addressing that type of problem versus a proprietary physics hardware device that will inevitably sit idle while the GPU may be overtaxed. We believe that two GPU's stand a far better chance of collaborating more effectively."

The last sentence leads me to believe that there may indeed be a performance hit for running both processes (physics and graphics) on the same GPU at the same time (how much, obviously remains to be seen)... In any case, all of this hardware physics stuff is still very much in the "wait and see" stage at this point. I don't know which route will end up being better (and I wouldn't mind paying for a discrete PPU if it has performance advantages over doing it on the GPU), but I am glad at least that other companies are starting to push the general idea (and not just someone we've never heard of before).
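A toy model of the load-balancing trade-off in that quote, with purely hypothetical timings, just to make the arithmetic concrete:

```python
# Toy model of the load-balancing trade-off in the quote above. A frame
# can't finish until both rendering and physics are done, so on a single
# GPU the two costs add; with dedicated hardware (a PPU or second GPU)
# they overlap. All timings here are hypothetical.

def fps_one_gpu(render_ms: float, physics_ms: float) -> float:
    """Rendering and physics share one GPU, so their costs add up."""
    return 1000.0 / (render_ms + physics_ms)

def fps_dedicated(render_ms: float, physics_ms: float) -> float:
    """Physics runs in parallel on separate hardware; the slower
    pipeline sets the frame rate."""
    return 1000.0 / max(render_ms, physics_ms)

print(f"shared GPU:     {fps_one_gpu(14.0, 5.0):.1f} fps")    # ~52.6, misses 60
print(f"dedicated unit: {fps_dedicated(14.0, 5.0):.1f} fps")  # ~71.4, render-bound
```

The catch, of course, is the quote's other point: dedicated physics hardware sits idle whenever there's little physics work to do, while a second GPU can be rebalanced between the two jobs.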
 

Alexstarfire

Senior member
Jul 25, 2004
385
1
76
Originally posted by: keysplayr2003
I think they will just incorporate an extra PPU chip (maybe an upgradable socket) on future gens. I think it will have to be standardized with the AGEIA chip to avoid monopolistic lawsuits. Just a guess.

The thing is, that's a very stupid idea. The problem with having a built-in PPU on a video card is that it most likely won't be upgradable. So pretty much if you want to upgrade one, you have to upgrade the other; hopefully they would both be upgrades and not one upgrade and one downgrade, but who knows. It's bad for the consumer because it just adds another cost to the already growing cost of video cards. I mean, if a regular top-o-the-line video card is like $600 now, just imagine what that'll be if they add a built-in PPU. I wouldn't be surprised if it hit the $1000 mark.

Say I only need better graphics and not better physics; if it's built in, I have no choice but to get both. That means I'm pretty much sacrificing some image quality because I'm forced to pay for a new PPU as well. That's just plain stupid if you ask me.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
At any rate, if this is happening (socket and onboard PPU), it isn't happening on R580. They can say it all they want; a refresh board is not going to be changed this much. I certainly wouldn't put it past R600 or G80, but R580? Not a shot in hell (new socket, new chip/pinout, new PCB, etc...)

-Kevin
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Gamingphreek
At any rate, if this is happening (socket and onboard PPU), it isn't happening on R580. They can say it all they want; a refresh board is not going to be changed this much. I certainly wouldn't put it past R600 or G80, but R580? Not a shot in hell (new socket, new chip/pinout, new PCB, etc...)

-Kevin

Perhaps not on the GPU chip itself, but what about an independent chip?
 