Yes. So when AMD uses async compute and kills Nvidia performance, that's perfectly fine in beta games; however, when Nvidia uses features AMD cannot and leans on tessellation in most AAA games, then it's gimping. Some posters here are really single-minded.
Did he write something wrong?
Please post a link.
I haven't read anything wrong.
It's just him smashing sotin's posts with facts.
Plz do see the Fable Legends benchmark. This is why the exchange of information is great: people can express differing views and make up their own minds after hearing the pros and cons of competing techs... I certainly welcome it... I believe MS's DX12 is where games are headed, and any tech that leverages it will benefit me more than the alternative... now that's my opinion, and it will shape my purchases... others may arrive at a different conclusion, and that's their prerogative...
According to this, 1Q16 looks rather gloomy for AMD
http://www.digitimes.com/news/a20160223PD208.html
For Q1 2016, based on a 13 week quarter, we expect:
Revenue to decrease 14% sequentially, +/- 3%, driven by game console seasonality and a cautious macro environment in China.
Gross margin to be approximately 32%.
Non-GAAP operating expenses to be approximately $320 million.
Interest expense, taxes and other to be approximately $42 million.
Cash and cash equivalents balances down approximately $100 million from the end of the fourth quarter, including approximately $70 million of cash interest payments. This does not include any cash proceeds related to the joint venture with Nantong Fujitsu Microelectronics.
Inventory to be flat from Q4 2015 levels.
Our fiscal year 2016 is based on 53 weeks and we will take the extra week in our fiscal fourth quarter.
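As a quick sanity check, the "decrease 14% sequentially, +/- 3%" guidance can be turned into a dollar range mechanically. A minimal sketch; note the ~$958M Q4 2015 revenue figure is my assumption from AMD's earnings release, not something stated in this thread:

```python
def guided_range(prior_revenue_m, midpoint_decline=0.14, spread=0.03):
    """Revenue range implied by 'decrease 14% sequentially, +/- 3%'.

    prior_revenue_m: prior quarter's revenue in millions of dollars.
    Returns (low, high) bounds for the guided quarter.
    """
    low = prior_revenue_m * (1 - (midpoint_decline + spread))   # -17% case
    high = prior_revenue_m * (1 - (midpoint_decline - spread))  # -11% case
    return low, high

# Assuming ~$958M for Q4 2015 (my figure, not from the thread),
# the guide implies roughly $795M-$853M for Q1 2016.
low, high = guided_range(958)
print(f"${low:.0f}M - ${high:.0f}M")
```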
So Hitman (2016), which runs like garbage on Nvidia hardware and looks worse than the 2013 game, is a good game simply because AMD performs better on it and the Fury X manages 60 fps at 1080p?
That is what I said: some posters are single-minded.
HITMAN is not released yet.
Tell me, has any of his posts been confirmed right by AMD or Nvidia?
Well, the Async Compute advantage to AMD due to hardware scheduling and the small, but not insignificant performance benefit they receive from it is certainly a fact.
It is somewhat speculative to pin that as one of the main reasons why AMD's market share increased, though anecdotally that certainly seems to factor into purchasing decisions.
The Division is not released yet either, but it has much better performance and graphics for both camps. You just defend that game because it is AMD-sponsored.
I do not know, and I do not claim things. Apparently some posters here know more than AMD's and Nvidia's CEOs and employees; far more intelligent than them. I am not. The dev documentation is open to everybody; I like to read, so I read it, and what he says correlates directly with said documentation.
And I asked for a link so I could check, so if you are kind enough, can you please post it?
Yes, you are right.
Compare The Division's and Hitman's graphics and performance, then come back; I will be here.
Plz do see the Fable Legends benchmark.
AMD has apparently been able to offload up to 30% of a workload to async compute, making the 18% seen here a rather good example of async compute usage.
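To put rough numbers on that claim: if some fraction of a frame's work can be hidden behind the graphics queue on hardware with async scheduling, the frame time shrinks accordingly. A back-of-envelope sketch; the 30% and 18% figures come from the post above, while the overlap-efficiency model is my simplification:

```python
def frame_speedup(offload_fraction, overlap_efficiency=1.0):
    """Estimate fractional frame-rate gain when part of the frame's
    work is moved to an async compute queue that overlaps with the
    graphics queue. overlap_efficiency=1.0 assumes the offloaded
    work hides completely behind the remaining graphics work.
    """
    hidden = offload_fraction * overlap_efficiency
    new_frame_time = 1.0 - hidden       # relative to original time = 1.0
    return 1.0 / new_frame_time - 1.0   # fractional fps gain

# Perfectly overlapping a 30% offload would be a ~43% gain; the 18%
# observed corresponds to roughly half of that offload being hidden.
print(f"ideal:    {frame_speedup(0.30, 1.0):.0%}")
print(f"observed: {frame_speedup(0.30, 0.51):.0%}")
```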
I am not claiming stuff. I see you have dropped your accusation; good,
we can carry on with the thread now.
More than 70% of dGPU users can use GameWorks.
This is a sick joke, right?
Every buddy of mine with a 970/980 disabled HairWorks in Witcher 3 cos it KILLED their minimum frames per second three times worse than it did their average fps. Unplayable even on NV GPUs; it's an utter disgrace of a feature. It was so bad CDPR had to get the source code for HairWorks, drop tessellation from x64 to x32, and add sliders for x8/x16 in a patch, because they caved after their forums were filled with complaints.
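For context on why dropping from x64 to x32 helps so much: with uniform tessellation the number of generated triangles grows roughly with the square of the tessellation factor, so halving the factor cuts the geometry load by about 4x. A rough illustration; the quadratic-growth model is my simplification, and real D3D11 partitioning modes deviate slightly:

```python
def relative_triangle_load(factor, baseline=64):
    """Approximate triangle count of a tessellated patch relative to
    a baseline tessellation factor, using ~factor^2 growth.
    """
    return (factor ** 2) / (baseline ** 2)

# Each halving of the factor cuts generated geometry to ~1/4.
for f in (64, 32, 16, 8):
    print(f"x{f}: ~{relative_triangle_load(f):.2%} of x64 geometry")
```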
HBAO+, god rays, volumetric lighting.
Oops, sorry about that NVIDIA
I think a few of us played a role in this when we sorta, kinda kicked off the Async Compute controversy. When I invited Kollock over to overclock.net to comment on my theory on the matter, the issue blew up. That probably played a role...
That is why I have told the mods and posted on Nvidia's and AMD_roy's Twitter about him. If you think some random forum PR is going to change anything in the market, you are wrong. Whatever people post on this or any other forum has no influence on the overall share. Consumers/OEMs are smarter than that.