[pcper] Interview: AMD's Richard Huddy

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
http://www.pcper.com/news/General-T...ew-AMDs-Richard-Huddy-June-17th-4pm-ET-1pm-PT

They talk about 3 topics:

Gameworks

AMD is obviously still concerned about it. A few key points I noted from this were:
- It's possible for a game to ship where the original shader program only ever runs on AMD's hardware, because Nvidia already has a replacement shader ready in its drivers. That makes such a game a poor benchmark.
- Nvidia is sharing source code with some developers, but they are under contract not to share it with AMD, so AMD can't see the code and help them optimise it.

Mantle
- Noncommittal on Linux. Said something like: if it's the future of gaming, then AMD will go there.
- They intend to keep supporting it after DX12 is released, since it lets them expose direct access to new hardware features for the companies that want that.
- They are committed to releasing Mantle as a standard by the end of 2014. That means it's open for ISVs to use and for IHVs to implement if they choose, with no licensing fees.
- Some DX11 development resources were diverted to Mantle, but they feel they should be given credit for moving the industry forward and informing the DX12 process, which will improve gaming for everyone.

Freesync v Gsync
- Huddy claims that G-Sync introduces a frame of latency due to its buffer (I don't think this is the case; I think the buffer is there purely for minimum-refresh-rate refreshes).
- What I gathered from this discussion is that the main difference between the solutions is who is responsible for the minimum refresh rate. In Nvidia's solution the frame is buffered, and if the maximum refresh interval passes the screen refreshes itself from that buffer. In AMD's solution the GPU is given a range of refresh rates, and if rendering exceeds that range the GPU is responsible for sending the same image again (rough sketch after this list).
- Some mention of FreeSync just being eDP-based variable refresh, with no mention of vblank or anything else about how it works. It's still not clear exactly how AMD is doing this.
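Not AMD's code, obviously, but here's roughly how I picture the GPU-side pacing described in that second bullet, with a made-up 40-144 Hz panel (all names and numbers are hypothetical):

```c
/* Hypothetical sketch of the GPU-side pacing described above: the panel
 * advertises a supported refresh range, and the driver must never let the
 * gap between scan-outs exceed the longest allowed frame interval, so it
 * resends the previous frame when rendering runs long.  Made-up names and
 * numbers -- AMD has not published any implementation details.
 */
#include <stdio.h>

#define MIN_REFRESH_HZ 40.0   /* assumed lower bound of the panel's range */
#define MAX_REFRESH_HZ 144.0  /* assumed upper bound                      */

static const double max_interval_ms = 1000.0 / MIN_REFRESH_HZ; /* 25 ms   */
static const double min_interval_ms = 1000.0 / MAX_REFRESH_HZ; /* ~6.9 ms */

/* Decide what to scan out, given how long the current frame took to render. */
static void pace_frame(double render_time_ms)
{
    if (render_time_ms > max_interval_ms) {
        /* New frame isn't ready in time: repeat the last completed frame
         * so the panel never drops below its minimum refresh rate.       */
        printf("%5.1f ms render: resend previous frame at %.1f ms\n",
               render_time_ms, max_interval_ms);
    } else if (render_time_ms < min_interval_ms) {
        /* GPU outran the panel's maximum refresh rate: hold the frame
         * until the earliest moment the panel can accept it.             */
        printf("%5.1f ms render: wait, scan out at %.1f ms\n",
               render_time_ms, min_interval_ms);
    } else {
        /* Inside the supported range: scan out as soon as it's done.     */
        printf("%5.1f ms render: scan out immediately\n", render_time_ms);
    }
}

int main(void)
{
    pace_frame(5.0);   /* faster than 144 Hz */
    pace_frame(12.0);  /* inside the range   */
    pace_frame(30.0);  /* slower than 40 Hz  */
    return 0;
}
```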
 
Last edited by a moderator:

Mand

Senior member
Jan 13, 2014
664
0
0
Freesync v Gsync
- Huddy claims that G-Sync introduces a frame of latency due to its buffer (I don't think this is the case; I think the buffer is there purely for minimum-refresh-rate refreshes).
- What I gathered from this discussion is that the main difference between the solutions is who is responsible for the minimum refresh rate. In Nvidia's solution the frame is buffered, and if the maximum refresh interval passes the screen refreshes itself from that buffer. In AMD's solution the GPU is given a range of refresh rates, and if rendering exceeds that range the GPU is responsible for sending the same image again.
- Some mention of FreeSync just being eDP-based variable refresh, with no mention of vblank or anything else about how it works. It's still not clear exactly how AMD is doing this.

G-Sync doesn't introduce latency - this has been tested by third parties. That AMD is this off-base on what Nvidia is doing doesn't give me confidence about why they think their solution will be better.

I'm just going off your summary, but I find the description of "the GPU is given a range of refresh rates" rather confusing...is it actually doing what people first predicted it might do based on CES information, that it would have to try to "guess" the frame interval and just hope it's right?

I'll watch the whole thing in detail later when I have time, will have more of a response then.

Gameworks

AMD is obviously still concerned about it. A few key points I noted from this were:
- It's possible for a game to ship where the original shader program only ever runs on AMD's hardware, because Nvidia already has a replacement shader ready in its drivers. That makes such a game a poor benchmark.
- Nvidia is sharing source code with some developers, but they are under contract not to share it with AMD, so AMD can't see the code and help them optimise it.

I'm still not clear why AMD thinks it deserves access to Gameworks code just because a game developer is using it.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Some great reasoning by Huddy: basically, it has lots of memory, so there must be an extra frame of latency, right?

Blurbusters already measured the latency of G-Sync, and it was roughly the same as an unmodded monitor with vsync off.
 
Last edited:

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Wasn't satisfied by that summary, so I watched the whole thing instead. It was pretty informative, not bad for an hour-long interview, though some of it was dev talk.

Watch Dogs isn't really a great demonstrator for Gameworks; Nvidia needs to pick someone other than Ubisoft, who can't seem to optimize anything. Not sure about the Batman devs either, since those are the guys that broke the DX11 release of Arkham City.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
If Huddy was at Nvidia or still at Intel, and was in a similar position of employment, he'd be saying or spinning the same arguments from either of those company's point of view.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I like the idea of OpenFX. A liberally-licensed effects library would give Indies an enormous shot in the arm and would really raise the bar for graphics industry-wide.

I seriously hope that wasn't just idle musing and they do end up following through with that. Knowing AMD though... ah, well.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
G-Sync doesn't introduce latency - this has been tested by third parties. That AMD is this off-base on what Nvidia is doing doesn't give me confidence about why they think their solution will be better.

I'm just going off your summary, but I find the description of "the GPU is given a range of refresh rates" rather confusing...is it actually doing what people first predicted it might do based on CES information, that it would have to try to "guess" the frame interval and just hope it's right?

I'll watch the whole thing in detail later when I have time, will have more of a response then.



I'm still not clear why AMD thinks it deserves access to Gameworks code just because a game developer is using it.

How do you know one frame of latency is not added? Do you know something about how g-sync works that the rest of us do not? nVidia has not given any details on any of this. But I have no doubt that AMD has purchased one and dived into how nVidia is doing it. It would be hard to measure, and given that g-sync does use a buffer, it would make sense for there to be an extra frame.

Not sure why you think AMD is guessing on the frame interval. The GPU does not have to guess about what it is doing. It knows.

So you think it's perfectly fine that nVidia blocks developers from optimizing their OWN GAME for other GPUs? If AMD did this, people would be screaming. No true gamer is going to side with nVidia on this. Having game developers in charge of their own code, with the freedom to optimize whatever they want, is good for gamers, period.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,446
126
If Huddy was at Nvidia or still at Intel, and was in a similar position of employment, he'd be saying or spinning the same arguments from either of those company's point of view.

Yeah... Newsflash: AMD employee thinks that the products whose sales pay for his paycheck are better than those of his competitor. News at 11
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
How do you know one frame of latency is not added? Do you know something about how g-sync works that the rest of us do not? nVidia has not given any details on any of this. But I have no doubt that AMD has purchased one and dived into how nVidia is doing it. It would be hard to measure, and given that g-sync does use a buffer, it would make sense for there to be an extra frame.

Not sure why you think AMD is guessing on the frame interval. The GPU does not have to guess about what it is doing. It knows.

So you think it's perfectly fine that nVidia blocks developers from optimizing their OWN GAME for other GPUs? If AMD did this, people would be screaming. No true gamer is going to side with nVidia on this. Having game developers in charge of their own code, with the freedom to optimize whatever they want, is good for gamers, period.

Blurbusters measured the latency of games with G-Sync; it's the same as an unmodified monitor running without vsync. All high-refresh screens have buffers; I heard it's for overdrive calculations.
 

Mand

Senior member
Jan 13, 2014
664
0
0
How do you know one frame of latency is not added? Do you know something about how g-sync works that the rest of us do not? nVidia has not given any details on any of this.

Not only has Nvidia given quite a few details that indicate it does not use a latency-inducing frame buffer, it has also been tested and measured by third parties. I know a frame of latency is not added because it was tested specifically for any added latency, and they measured none.
 
Last edited:

Despoiler

Golden Member
Nov 10, 2007
1,967
772
136
G-Sync doesn't introduce latency - this has been tested by third parties. That AMD is this off-base on what Nvidia is doing doesn't give me confidence about why they think their solution will be better.

Just the fact that there is a buffer automatically means there is additional latency. That's what buffers do: they smooth out the delivery of some form of data at the cost of latency. There is also polling involved, which means overhead. Third parties have tested it, too, and they do find additional latency. It's not a lot more than v-sync off, but it is there, and it gets worse as you approach the lower and upper bounds of the refresh range.

http://www.blurbusters.com/gsync/preview2/
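To put rough numbers on that trade-off (purely illustrative, and not a claim about what the G-Sync module actually buffers): holding one full frame before scan-out would cost one refresh interval of latency.

```c
/* Back-of-the-envelope only: IF a display pipeline held one full frame
 * before scan-out, the added latency would be one refresh interval.
 * Illustrative arithmetic, not a description of how G-Sync works.
 */
#include <stdio.h>

int main(void)
{
    const double refresh_rates_hz[] = { 60.0, 120.0, 144.0 };
    const int n = sizeof(refresh_rates_hz) / sizeof(refresh_rates_hz[0]);

    for (int i = 0; i < n; i++) {
        double frame_ms = 1000.0 / refresh_rates_hz[i]; /* one refresh interval */
        printf("%.0f Hz: one buffered frame = %.2f ms extra latency\n",
               refresh_rates_hz[i], frame_ms);
    }
    return 0;
}
```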

I'm just going off your summary, but I find the description of "the GPU is given a range of refresh rates" rather confusing...is it actually doing what people first predicted it might do based on CES information, that it would have to try to "guess" the frame interval and just hope it's right?

It's only confusing if you've done no research and don't understand anything about the technology you're discussing. The GPU is given the refresh rate range via EDID. If the frame rendering time exceeds that range, the GPU resends the last frame. The GPU knows everything about what is going on during rendering, as well as the capabilities of the monitor. There is no guessing and there is no polling.
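For what it's worth, here's a simplified sketch of where that vertical refresh range lives in an EDID display range limits descriptor. The layout is heavily simplified and should be treated as approximate; a real parser deals with extension blocks, offset flags, checksums and more.

```c
/* Simplified sketch of reading a panel's vertical refresh range from an
 * EDID "display range limits" descriptor (tag 0xFD).  Layout simplified
 * and approximate -- a real EDID parser handles many more cases.
 */
#include <stdint.h>
#include <stdio.h>

struct refresh_range {
    unsigned min_hz;
    unsigned max_hz;
};

/* desc points at one 18-byte display descriptor inside an EDID block. */
static int parse_range_limits(const uint8_t desc[18], struct refresh_range *r)
{
    /* Display descriptors start with zero bytes followed by a tag byte. */
    if (desc[0] != 0x00 || desc[1] != 0x00 || desc[3] != 0xFD)
        return -1;                      /* not a range-limits descriptor  */

    r->min_hz = desc[5];                /* minimum vertical rate, in Hz   */
    r->max_hz = desc[6];                /* maximum vertical rate, in Hz   */
    return 0;
}

int main(void)
{
    /* Hypothetical descriptor for a 40-144 Hz panel (other bytes zeroed). */
    uint8_t desc[18] = { 0x00, 0x00, 0x00, 0xFD, 0x00, 40, 144 };
    struct refresh_range r;

    if (parse_range_limits(desc, &r) == 0)
        printf("panel accepts %u-%u Hz; resend the last frame if a new one\n"
               "is not ready within %.1f ms\n",
               r.min_hz, r.max_hz, 1000.0 / r.min_hz);
    return 0;
}
```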

I'm still not clear why AMD thinks it deserves access to Gameworks code just because a game developer is using it.

Strawman argument. It's not that they think they deserve the code. Every company that does graphics on any device, including Nvidia, supplies its SDK as source code. Gameworks diverges from that by supplying DLLs. Some developers, not all, are given the source code to the DLLs, but they cannot share that code with AMD or with other developers.

Warning issued for personal attack.
-- stahlhart
 
Last edited by a moderator:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Just the fact that there is a buffer automatically means there is additional latency. That's what buffers do: they smooth out the delivery of some form of data at the cost of latency.
Just because there's memory doesn't mean there's a buffer.
 

gorobei

Diamond Member
Jan 7, 2007
3,777
1,226
136
It's a shame you guys are so busy harping on the same tired issues that got the G/Free/A-Sync threads closed that you missed the part where Huddy gives a release date outline for the first A-Sync monitors.

But I'm sure everyone here who is posting watched the entire video instead of knee/circle-jerking a response.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Just the fact that there is a buffer automatically means there is additional latency. That's what buffers do.

And the fact that there isn't additional latency would indicate that there isn't a buffer, wouldn't it?

And seriously - do we have to start with the personal attacks again?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
It's a shame you guys are so busy harping on the same tired issues that got the G/Free/A-Sync threads closed that you missed the part where Huddy gives a release date outline for the first A-Sync monitors.

But I'm sure everyone here who is posting watched the entire video instead of knee/circle-jerking a response.

I don't have time to watch it right now. When are the A-sync monitors coming?!
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I don't have time to watch it right now. When are the A-sync monitors coming?!

To reviewers Sept/Oct and release Jan/Feb.

He also said that an entire lineup of GPUs supporting A-sync would be available by the time these monitors hit the market (currently only GCN 1.1 GPUs support it; they're the only ones with the "required complexity" in their display controllers at the moment). Pretty much confirming the next gen.

EDIT: Also, I was laughing hard as hell when he dared Nvidia to make a blog post releasing devs from NDA so they can talk about the Gameworks licensing agreements.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
G-Sync only holds a frame in its buffer when the GPU is rendering faster than the display's refresh rate. Otherwise it sends the frame immediately to the display.

http://www.pcper.com/news/General-T...ew-AMDs-Richard-Huddy-June-17th-4pm-ET-1pm-PT

They talk about 3 topics:

Gameworks

AMD is obviously still concerned about it. A few key points I noted from this were:
- It's possible for a game to ship where the original shader program only ever runs on AMD's hardware, because Nvidia already has a replacement shader ready in its drivers. That makes such a game a poor benchmark.

So there is no difference between Gameworks and Gaming Evolved?
Ironic that AMD doesn't like a taste of their own medicine. :sneaky:

BTW: They want to introduce "OpenWorks". Lol, the same nonsense as "OpenPhysics". They will do nothing and hope that somebody will create new effects.
 
Last edited:

Mand

Senior member
Jan 13, 2014
664
0
0
G-Sync only holds a frame in its buffer when the GPU is rendering faster than the display's refresh rate. Otherwise it sends the frame immediately to the display.

And it triggers the display to refresh at that moment - when the render is done and the frame is ready - which is the whole point.

And the case where buffering has to happen is when the GPU is rendering at a rate higher than the maximum refresh rate of the display. At that point you basically get vsync-like operation.
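A toy model of that vsync-like case, with made-up numbers: if the GPU finishes frames faster than the panel's maximum refresh rate, each frame simply waits for the next legal scan-out slot, so the added wait stays below one refresh interval.

```c
/* Toy simulation of the "vsync-like" case described above: the GPU
 * produces frames faster than the panel's maximum refresh rate, so each
 * frame waits for the earliest legal scan-out time.  Numbers are
 * illustrative only, not measurements of G-Sync.
 */
#include <stdio.h>

int main(void)
{
    const double max_refresh_hz = 144.0;
    const double scanout_gap_ms = 1000.0 / max_refresh_hz; /* ~6.94 ms         */
    const double render_gap_ms  = 4.0;   /* GPU finishes a frame every 4 ms    */
    double t = 0.0;                      /* when the GPU starts the next frame */
    double next_scanout = 0.0;           /* earliest legal scan-out time       */

    for (int i = 0; i < 5; i++) {
        double done    = t + render_gap_ms;                 /* frame is ready  */
        double scanout = (done > next_scanout) ? done : next_scanout;

        printf("frame %d: ready %6.2f ms, scanned out %6.2f ms (waited %.2f ms)\n",
               i, done, scanout, scanout - done);

        next_scanout = scanout + scanout_gap_ms;
        t = scanout;  /* GPU blocks, vsync-style, until the frame is handed off */
    }
    return 0;
}
```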
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
To reviewers Sept/Oct and release Jan/Feb.

He also said that an entire lineup of GPUs supporting A-sync would be available by the time these monitors hit the market (currently only GCN 1.1 GPUs support it; they're the only ones with the "required complexity" in their display controllers at the moment). Pretty much confirming the next gen.

EDIT: Also, I was laughing hard as hell when he dared Nvidia to make a blog post releasing devs from NDA so they can talk about the Gameworks licensing agreements.

My guess is they will refresh their mid and lower tier cards up to GCN 1.1, but the 290, 290X and 295X2 will remain their top cards until next year.

Early next year is pretty good for A-sync, I guess. There is still only one G-Sync monitor; next month there will be two. Not exactly much catch-up to do. I think A-sync will quickly eclipse G-Sync anyway; proprietary is not going to go anywhere versus a ratified VESA standard. Also, with display makers having to do a custom implementation of the G-Sync chip on a panel-by-panel basis, it's not something we'll ever see proliferate.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
So, there is no difference between Gameworks and Gaming Evolved?
Ironic, that AMD doesnt like their own medicine. :sneaky:

With Gaming Evolved, AMD sends source code, not DLLs. And there is no clause preventing the developer from optimizing for other GPUs.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
With Gaming Evolved, AMD sends source code, not DLLs. And there is no clause preventing the developer from optimizing for other GPUs.

Sure, after the game was released, and they could use an unstable version to make performance comparisons:
http://community.amd.com/community/...ider-launches-with-revolutionary-tressfx-hair

BTW: An internal benchmark is essential for a Gaming Evolved title - surprise, surprise:
Games that benchmark well and have reproducible results that can be reported by AMD and the media are also preferred. Inclusion of an integrated benchmark mode in the game is also a plus as it more likely gets review media interested in including that game in their test suite and also allows the public to run their own tests to compare results.
http://www.pcper.com/news/Graphics-Cards/AMD-Planning-Open-Source-GameWorks-Competitor-Mantle-Linux
 

el etro

Golden Member
Jul 21, 2013
1,581
14
81
Tomb Raider performance was fixed on Nvidia in one or two months. AMD's performance in Splinter Cell: Blacklist (and in Batman: Arkham Origins, another Nvidia title) is still not fixed.
 

Despoiler

Golden Member
Nov 10, 2007
1,967
772
136
And the fact that there isn't additional latency would indicate that there isn't a buffer, wouldn't it?

And seriously - do we have to start with the personal attacks again?

Fact? I just provided a link that shows it's not a fact. Pointing out that you are factually incorrect isn't a personal attack. You are the one who claimed you were confused; I agreed with you. You keep making the same claims you've made in the past, except we aren't in a speculative discussion anymore. AMD has told us how their solution works.
 