Would you care to ask Nvidia a Question?


bryanW1995

Lifer
May 22, 2007
Originally posted by: Keysplayr
Originally posted by: thilanliyan
Like a lot of others, I want to know...when will the desktop version of GT300 be released? (I know this will probably not be picked as one of your top 5, but I'm willing to bet it's the question most people want answered...after performance figures.)

Thilan, Bryan, did you read the OP?

Yes, I read it; I just feel that the release of GT300 is on everybody's mind right now (including mine). As pissed off as I am about "cardboard box-gate" and all of the recent delays, I'm also very excited about Dragon Age, and I'm concerned that my GTX 260 won't be enough for it. It must be karma that I've been ridiculing Nvidia for a long time now about PhysX and the one must-have title for me this year supports...PhysX.

OK keys, here is a question that many people here may find relevant and that doesn't violate any of the conditions from the OP. I have a P35 mobo and might end up using my GTX 260 as a PhysX card when GT300 comes out. Will I need an SLI mobo to use the GTX 260 for PhysX? If not, how much of a performance hit will the 4x slot be for me?
 

alcoholbob

Diamond Member
May 24, 2005
Originally posted by: palladium
Sorry, another Q:

How would you rank Ageia's PPU physics processing capability relative to a PhysX capable GPU? (i.e. is it on par with 8600GT, 9800GTX+..?)

This is easy. With non-ATI VGA renderers, it performs badly (Firingsquad has benchmarks). With ATI video cards for rendering, it does not function.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Adrenaline
Since you have multiple people taking questions across several forums, would it be possible to grab some other questions that are answered somewhere else and post them here, with a link if possible?

Some people see only 5 questions being answered but do not see the whole picture of, say, 10 forums having 50 questions answered between them. This would actually get more info out across several areas.

Many of the exact same questions are being asked across several forums. Of course I will grab the other questions from other forums and post them here as they are answered. And it isn't 5 questions per forum; it's 5 per week total. Chances are, though, that a question answered on another forum will also be one that was asked here.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: palladium
Sorry, another Q:

How would you rank Ageia's PPU physics processing capability relative to a PhysX capable GPU? (i.e. is it on par with 8600GT, 9800GTX+..?)

Ageia's PhysX PPU isn't as capable as even an 8600GT for PhysX processing. In most, if not all, cases when running an Ageia PPU alongside a 9800GTX or an 8800GT, for example:

http://www.firingsquad.com/har...sx_performance_update/

It shows you're better off just running PhysX on the 9800GTX or 8800GT while it's also the primary rendering card. This graph is a bit old, but you can see the pattern. While the PPU is definitely faster than running PhysX in software, it cannot compare to the PhysX processing power of even an 8600GT. My own tests have shown that even a dedicated 8600GT offers at least some improvement over a single Nvidia GPU handling both rendering and PhysX.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: bryanW1995
Originally posted by: Keysplayr
Originally posted by: thilanliyan
Like a lot of others, I want to know...when will the desktop version of GT300 be released? (I know this will probably not be picked as one of your top 5, but I'm willing to bet it's the question most people want answered...after performance figures.)

Thilan, Bryan, did you read the OP?

Yes, I read it; I just feel that the release of GT300 is on everybody's mind right now (including mine). As pissed off as I am about "cardboard box-gate" and all of the recent delays, I'm also very excited about Dragon Age, and I'm concerned that my GTX 260 won't be enough for it. It must be karma that I've been ridiculing Nvidia for a long time now about PhysX and the one must-have title for me this year supports...PhysX.

OK keys, here is a question that many people here may find relevant and that doesn't violate any of the conditions from the OP. I have a P35 mobo and might end up using my GTX 260 as a PhysX card when GT300 comes out. Will I need an SLI mobo to use the GTX 260 for PhysX? If not, how much of a performance hit will the 4x slot be for me?

You do not need an SLI-capable motherboard.
If you've noticed, many of the new P55 motherboards advertise "PhysX Ready". This could be an SLI P55 board with two x8 slots plus an x4 slot dedicated to PhysX, or another model without SLI capability that has a single x16 PCIe slot for the primary card and an x4 slot for PhysX. Your GTX 260 would run perfectly well in an x4 slot in a PhysX processing capacity without a performance hit; GPUs processing PhysX apparently do not need more than x4.
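
As a rough sanity check on why an x4 slot is enough, here is a minimal back-of-envelope sketch. The per-lane rates are the standard PCIe 1.1/2.0 figures; the per-frame physics traffic number is purely an assumed illustration, not a measured value.

# Back-of-envelope PCIe bandwidth check for a dedicated PhysX card.
# Per-lane effective rates: PCIe 1.1 ~250 MB/s, PCIe 2.0 ~500 MB/s (per direction).
# The "physics traffic" figure below is an assumed, illustrative number.

PER_LANE_MB_S = {"PCIe 1.1": 250, "PCIe 2.0": 500}

def slot_bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Effective one-direction bandwidth of a PCIe slot, in MB/s."""
    return PER_LANE_MB_S[gen] * lanes

physics_traffic_mb_s = 2 * 60  # assumed: 2 MB of object/constraint data per frame at 60 fps

for gen in PER_LANE_MB_S:
    for lanes in (4, 16):
        bw = slot_bandwidth_mb_s(gen, lanes)
        print(f"{gen} x{lanes}: {bw} MB/s "
              f"(~{bw / physics_traffic_mb_s:.0f}x the example physics traffic)")

Even an x4 slot at PCIe 1.1 rates gives about 1 GB/s per direction, far more than this illustrative workload, which is consistent with a PhysX-only card seeing no meaningful penalty in an x4 slot.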
 

zebrax2

Senior member
Nov 18, 2007
Suggestion
Create a new thread weekly instead of using one thread. Post all previously answered questions to the new thread then lock the old one.
 

Blazer7

Golden Member
Jun 26, 2007
What are the main aspects of the ForceWare drivers that nV is most eager to add to or improve, and what will these changes mean for the end user?

Is nV working on a "cure" for micro-stuttering in multi-GPU setups? And if so, when are we going to see some results?

Currently most SLI profiles use AFR. Is there a plan to improve SFR so that it becomes the "default" rendering scheme for multi-GPU setups?
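
For readers unfamiliar with the two schemes the question refers to: AFR hands whole alternate frames to each GPU, while SFR splits every frame between them. A toy sketch of the difference in work assignment (the frame/region labels here are invented purely for illustration):

# Toy illustration of AFR vs SFR work assignment across 2 GPUs.
# "Work" here is just labeled strings; real drivers split actual rendering.

NUM_GPUS = 2

def afr_assign(frame_id: int) -> int:
    """Alternate Frame Rendering: each GPU renders whole frames in turn."""
    return frame_id % NUM_GPUS

def sfr_assign(frame_id: int) -> list:
    """Split Frame Rendering: every frame is divided among all GPUs."""
    return [(gpu, f"frame {frame_id}, region {gpu}") for gpu in range(NUM_GPUS)]

for f in range(4):
    print(f"AFR: frame {f} -> GPU {afr_assign(f)}")
    print(f"SFR: frame {f} -> {sfr_assign(f)}")

AFR keeps each GPU's work simple but makes frame pacing (and thus micro-stuttering) harder to control; SFR avoids that but is harder to load-balance, which is part of why AFR remains the usual default.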
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
What card would Nvidia recommend as a standalone PhysX card? I grabbed a 9800GT planning on getting a better card and demoting this one to PhysX only, but I kind of took a shot in the dark picking that card, as I couldn't really find good info about it.
 

Mem

Lifer
Apr 23, 2000
Originally posted by: Keysplayr
Originally posted by: Adrenaline
Since you have multiple people taking questions across several forums, would it be possible to grab some other questions that are answered somewhere else and post them here, with a link if possible?

Some people see only 5 questions being answered but do not see the whole picture of, say, 10 forums having 50 questions answered between them. This would actually get more info out across several areas.

Many of the exact same questions are being asked across several forums. Of course I will grab the other questions from other forums and post them here as they are answered. And it isn't 5 questions per forum; it's 5 per week total. Chances are, though, that a question answered on another forum will also be one that was asked here.

Hmm, it's hard to think of an original question since most of the members here gave good questions/feedback. However, has Nvidia thought about going into the CPU market, or will it?
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: bryanW1995
It must be karma that I've been ridiculing nvidia for a long time now about physix and the one must-have title for me this year supports...physix.

I seriously doubt it will be GPU physics though...so the PhysX effects will be running on your CPU anyway. I've tried searching, but the best I could find were sites stating that PhysX is integrated into Bioware's Eclipse engine...which basically means (from the lack of any hype about it) that there is NO GPU PhysX in Dragon Age. And considering this is a multiplatform release, there would have been some announcement about it (you can bet nV would have been singing about it for such an AAA title) if it was in fact going to use GPU PhysX. So don't worry...your karma is still intact.
 

v8envy

Platinum Member
Sep 7, 2002
I don't think a game engine has to do anything special to get GPU acceleration of PhysX. The DLL implementing the API has the option of offloading processing onto one or more GPUs if it feels like it, but the engine relying on that API wouldn't need to even know.
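
To illustrate that point, here is a minimal sketch of transparent CPU/GPU dispatch behind a stable physics API. It is not the real PhysX SDK; every class and method name below is hypothetical.

# Hypothetical sketch (not the real PhysX SDK): the game engine codes
# against one physics interface; whether the library simulates on the
# CPU or offloads to a GPU is an internal detail the engine never sees.

class PhysicsScene:
    def __init__(self):
        # The library probes for a capable GPU at startup; the engine
        # does not participate in, or even observe, this decision.
        self._use_gpu = self._detect_capable_gpu()

    def _detect_capable_gpu(self) -> bool:
        return False  # stand-in for a real driver/device query

    def simulate(self, dt: float) -> None:
        if self._use_gpu:
            self._simulate_on_gpu(dt)   # hardware-accelerated path
        else:
            self._simulate_on_cpu(dt)   # software fallback

    def _simulate_on_gpu(self, dt: float) -> None:
        pass  # would dispatch GPU work in a real implementation

    def _simulate_on_cpu(self, dt: float) -> None:
        pass  # would run a plain CPU solver in a real implementation

# The engine side looks identical no matter where the work runs:
scene = PhysicsScene()
for _ in range(3):          # a few frames
    scene.simulate(1 / 60)

The engine's loop calls scene.simulate(dt) the same way in both cases, which matches the point above: a game doesn't need special code merely to benefit from GPU offload, while hardware-only extra effects are a separate matter.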

As far as anyone knows, DA will only use PhysX for arrow flight and collision, so it would not require GPU acceleration. ATI users are in the clear so long as they have a decent CPU. In fact, users with less manly GPUs and really well-hung CPUs may wish to disable hardware acceleration even with an NV card, while users with less manly CPUs will need a potent NV card.

Quote from Ross Gardner:
Nvidia now owns PhysX, and have put in full hardware support for it on all their latest cards - and some games have added hardware acceleration specific features to their games (that you'd only see on Nvidia cards). We didn't do any of those on DAO so you're fine with an ATI. Technically there would be some improvement to have a hardware PhysX card - but it might be so minor you'd never notice.

Smart developer.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: v8envy
I don't think a game engine has to do anything special to get GPU acceleration of PhysX. The DLL implementing the API has the option of offloading processing onto one or more GPUs if it feels like it, but the engine relying on that API wouldn't need to even know.

Yeah, I was thinking of the added effects, like in Arkham and Cryostasis, that required GPU acceleration. So it looks like mostly everyone is in the clear for DAO then.
 

Mr Fox

Senior member
Sep 24, 2006
Originally posted by: Janooo
Q: My not-even-two-years-old 6150 Go laptop died due to the faulty bumps. Can NV cover all chipsets and GPUs that are affected?
Thanks.

P.S. I am really disappointed. If I don't get this issue reasonably fixed, NV will not see any money from me anymore.

Ditto for me with an HP DV9000 (US)... just over 2 years old, and the problems started after 6 months of ownership.
 

JSt0rm

Lifer
Sep 5, 2000
Does Nvidia have good coffee around the office? Is it a good whole bean roast ground fresh for each pot?

I know it's a light question, but I'm totally serious.
 

Keysplayr

Elite Member
Jan 16, 2003
Lots of great questions, everyone. Don't get discouraged if your particular question does not get answered right away; it will be, in a relatively short amount of time.
This thread is basically the "collector" thread. Every question that has been asked, with some exceptions, has been collected for discussion and possible submission to Nvidia for this week. Only 5 initially across the forums. If they can address more than 5 at a given time (I'm sure some questions are simpler than others), you'll be the first to know.

As the first answers from Nvidia get back to us, I'll make a new thread just to keep things neat and clean.

P.S. JSt0rm01, do you really want me to waste a question to NV on coffee? I think it's safe to say that if they do have coffee in their offices, it's probably decent coffee.
Nvidia employees seem to be very happy individuals (at least those I have met), and no doubt damn good coffee deserves much of the credit.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Hmmm, this isn't particularly about any upcoming hardware that I have heard about, but rather a general question I am interested in.

Given the ever-shrinking die space used per transistor, do you see a point coming in the next few years where moving to eDRAM, at least enough for the frame buffer, would be a worthwhile use of die space, particularly considering the massive bandwidth benefit it would offer? Obviously there would still need to be consideration given to the maximum resolution/AA supported, but it seems that the benefits to the majority of gamers could be rather large, and with the increasing demands of GPGPU it could also serve as a rather large L3.
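
For a sense of the die budget involved, here is a rough, illustrative sizing of an uncompressed multisampled frame buffer. It assumes 4 bytes of color plus 4 bytes of depth/stencil per sample and ignores the compression real GPUs rely on; the resolutions and AA levels are just examples.

# Rough frame buffer sizing: how much on-die eDRAM a "frame buffer on
# die" approach would need. Assumes 4 B color + 4 B depth/stencil per
# sample and no framebuffer compression (real GPUs compress heavily).

def framebuffer_mib(width: int, height: int, aa_samples: int,
                    bytes_per_sample: int = 8) -> float:
    """Color + depth/stencil storage for one multisampled render target, in MiB."""
    return width * height * aa_samples * bytes_per_sample / (1024 ** 2)

for (w, h), aa in [((1680, 1050), 4), ((1920, 1200), 4), ((2560, 1600), 8)]:
    print(f"{w}x{h} @ {aa}xAA: ~{framebuffer_mib(w, h, aa):.0f} MiB")

Even at these common resolutions the uncompressed totals run from roughly 50 MiB into the hundreds, which is exactly why the maximum resolution/AA supported would be the limiting consideration for an on-die frame buffer.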
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: BenSkywalker
Hmmm, this isn't particularly about any upcoming hardware that I have heard about, but rather a general question I am interested in.

Given the ever-shrinking die space used per transistor, do you see a point coming in the next few years where moving to eDRAM, at least enough for the frame buffer, would be a worthwhile use of die space, particularly considering the massive bandwidth benefit it would offer? Obviously there would still need to be consideration given to the maximum resolution/AA supported, but it seems that the benefits to the majority of gamers could be rather large, and with the increasing demands of GPGPU it could also serve as a rather large L3.

Does anyone do eDRAM without SOI as the substrate? I know POWER and Cell with their eDRAM are SOI chips, but has anyone implemented a traditional trench-capacitor DRAM cell in an eDRAM IC on bulk Si and commercialized it yet?
 

bryanW1995

Lifer
May 22, 2007
Originally posted by: thilanliyan
Originally posted by: bryanW1995
It must be karma that I've been ridiculing nvidia for a long time now about physix and the one must-have title for me this year supports...physix.

I seriously doubt it will be GPU physics though...so the PhysX effects will be running on your CPU anyway. I've tried searching, but the best I could find were sites stating that PhysX is integrated into Bioware's Eclipse engine...which basically means (from the lack of any hype about it) that there is NO GPU PhysX in Dragon Age. And considering this is a multiplatform release, there would have been some announcement about it (you can bet nV would have been singing about it for such an AAA title) if it was in fact going to use GPU PhysX. So don't worry...your karma is still intact.

keys, can you verify if this is correct?
 

Keysplayr

Elite Member
Jan 16, 2003
Well, we have this quote from Ross Gardner, the lead programmer at Bioware. This quote is from June '09:

"I'll repost my reply from the requirements thread here:

As for PhysX I don't think it is a secret that DAO is using PhysX. We don't have any special features for hardware acceleration ONLY, however the version we use supports it if your card does. Hopefully I'm allowed to say that

As for how much is there, it is not a physics-heavy game - although we are running a full simulation."

Thilan, it's still far too early to know exactly how PhysX will be running in DAO. In other words, will there be "levels" of physics immersion or complexity depending on whether it is run on the CPU or an Nvidia GPU? Show us what you have learned about it. If you've read something more recent that goes into more detail, bring it on in.

From what Ross hints at here, it would seem that running PhysX in software (on the CPU) is supported, but if the GPU supports PhysX, then it will run on the GPU. All we can do is guess that PhysX would run better on the GPU, but again, it's too early to say, and we don't know the level of PhysX in the game.
 

Qbah

Diamond Member
Oct 18, 2005
Originally posted by: bryanW1995
keys, can you verify if this is correct?

Don't worry, it's gonna be "software" PhysX that can also run on the GPU if you have an nV card. But it will be kept at a level that won't kill framerate, because the CPU path is an unoptimized mess. Bioware doesn't want to alienate many, many gamers and will give the full experience to everybody.

Quote from Ross Gardner:
Nvidia now owns PhysX, and have put in full hardware support for it on all their latest cards - and some games have added hardware acceleration specific features to their games (that you'd only see on Nvidia cards). We didn't do any of those on DAO so you're fine with an ATI. Technically there would be some improvement to have a hardware PhysX card - but it might be so minor you'd never notice.
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Keysplayr
Show us what you have learned about it. If you've read something more recent that goes into more detail, bring it on in.

What Qbah posted above me is as much as I know, which pretty much says the difference between CPU and GPU PhysX in DAO will be negligible.
 

JSt0rm

Lifer
Sep 5, 2000
Originally posted by: Keysplayr
P.S. JSt0rm01, do you really want me to waste a question to NV on coffee? I think it's safe to say that if they do have coffee in their offices, it's probably decent coffee.
Nvidia employees seem to be very happy individuals (at least those I have met), and no doubt damn good coffee deserves much of the credit.

Dude, it's a great question! Have a couple of fun questions thrown in there too.
 

Keysplayr

Elite Member
Jan 16, 2003
"Does Nvidia have good coffee around the office? Is it a good whole bean roast ground fresh for each pot?"

Alright, I'll submit it. But don't count on this one getting answered right away. Then again, who knows?

EDIT: heh, looks like you've got your answer! Check the OP.
 

bryanW1995

Lifer
May 22, 2007
Originally posted by: thilanliyan
Originally posted by: Keysplayr
Show us what you have learned about it. If you've read something more recent that goes into more detail, bring it on in.

What Qbah posted above me is as much as I know, which pretty much says the difference between CPU and GPU PhysX in DAO will be negligible.

Well, since I have a GTX 260-216 and a 9600 GSO, I'll find out for you guys.
 

JSt0rm

Lifer
Sep 5, 2000
Originally posted by: Keysplayr
"Does Nvidia have good coffee around the office? Is it a good whole bean roast ground fresh for each pot?"

Alright, I'll submit it. But don't count on this one getting answered right away. Then again, who knows?

EDIT: heh, looks like you've got your answer! Check the OP.

hahahaha awesome :thumbsup:
 