[PCGamer, Gamepur] Batman Arkham Knight System Requirements, May Receive DX12 Patch

Page 3 - AnandTech Forums

RussianSensation

Elite Member
Sep 5, 2003
As a point of reference:

Arkham City - 1080P VHQ
GTX580 = 122 fps (PhysX Off)
GTX580 = 65 fps (PhysX High)

Arkham Origins - 1080P VHQ
GTX780 = 131 fps (no AA, PhysX Off)
GTX780 = 102 fps (4xMSAA, PhysX Off)
GTX780 = 76 fps (no AA, PhysX High)
GTX780 = 62 fps (4xMSAA, PhysX High)

Batman: Arkham Knight doesn't look that much more demanding than Origins. I presume a GTX 970 will be able to maintain 50-60 fps with 4xMSAA and PhysX High at 1080p, and with an overclock close to 1.5GHz, 60 fps should be a reasonable expectation.

For 2560x1440 and 2560x1600 gamers, you probably won't be able to get 60 fps averages with a single card with MSAA+PhysX High.
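The overclocking expectation above is just ratio arithmetic; here is a quick sketch of it (my own back-of-envelope, not from the benchmarks quoted: the ~1.2 GHz stock boost figure and the linear fps-vs-clock scaling are optimistic assumptions, since real games are partly memory- and CPU-bound):

```python
# Back-of-envelope estimate: fps after a core overclock, assuming fps
# scales linearly with core clock (an optimistic simplification).

def scaled_fps(base_fps: float, base_clock_ghz: float, oc_clock_ghz: float) -> float:
    """Scale a frame rate by the ratio of overclocked to stock core clock."""
    return base_fps * (oc_clock_ghz / base_clock_ghz)

# Assumed stock boost of ~1.2 GHz for a GTX 970, overclocked to 1.5 GHz:
print(round(scaled_fps(50.0, base_clock_ghz=1.2, oc_clock_ghz=1.5), 1))  # 62.5
```

A 50 fps baseline scales to roughly 62 fps under these assumptions, which is why 60 fps at 1.5 GHz is plausible but not guaranteed.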

 

Red Hawk

Diamond Member
Jan 1, 2011
Neither Arkham Asylum nor City was a GameWorks title.

Technically true, but "Gameworks" didn't exist when Asylum and City were made. They were still Nvidia-supported games under the "The Way It's Meant To Be Played" brand, and had features like PhysX, MSAA (which was rare for an Unreal Engine 3 game), and ambient occlusion.
 

Deders

Platinum Member
Oct 14, 2012
Technically true, but "Gameworks" didn't exist when Asylum and City were made. They were still Nvidia-supported games under the "The Way It's Meant To Be Played" brand, and had features like PhysX, MSAA (which was rare for an Unreal Engine 3 game), and ambient occlusion.

Which all helped make the game more immersive. The GameWorks features that a few recent games have had just made them look the same; I'd rather do without them, tbh.

I expect The Witcher 3 will make better, more unique use of whatever they decide suits the game, instead of lumping a load of standard effects together.
 

ocre

Golden Member
Dec 26, 2008
That was my previous post on the subject of GameWorks and Batman. And I do like the Batman games I've played. I'm posting in the thread because of that and because the subject of GameWorks came up, and it's a shame that it's in games I would otherwise play and enjoy.

Throughout all of my years involved in PCs, I am talking since the 1990s, I have seen people build entire PCs just to play a single game. Over all these years, I have seen countless people upgrade GPUs, CPUs, RAM, operating systems, etc., just to play a single game at the settings they want.

I am not trying to change your mind, just making an observation. This new stance I have been seeing on AnandTech I find completely at odds with everything I have ever seen when it comes to PC gaming. I know there can be emotional attachments to certain brands and hardware, but I have never seen anything like this before. Don't get me wrong, I do not mean any disrespect. I am not even saying your stance is wrong. Just saying.

I don't think it will ever catch on, this idea of boycotting games because someone with an Nvidia card might get higher frame rates. I remember a time when you couldn't play all new games without changing out graphics cards. Games you almost couldn't play at all without a specific graphics card.

It is not even like that today. Batman will run on AMD hardware as well as NV hardware. Nvidia pays money for extra bells and whistles that can be turned on or off. Who really cares if a guy with an Nvidia card gets x% higher frame rates in a game that Nvidia put a lot of work into?

Sorry!
That is just me thinking out loud. I am not trying to change your mind. Just my thoughts on it.
 

RussianSensation

Elite Member
Sep 5, 2003
Which all helped make the game more immersive.

For NV users, yes, while alienating everyone with an Intel or AMD card. PhysX, TXAA: you can't use any of those features on those other products. 85% of the gaming market is Intel+AMD. So who exactly did it benefit for Rocksteady to target the game 100% at NV cards only? We know why Rocksteady did it: marketing and $$$. Why else would they do it?

Let me ask you this: imagine if Rocksteady made in-game physics effects (à la Red Faction: Guerrilla) that work on Intel, NV, and AMD hardware. Wouldn't that be better? Secondly, imagine if Rocksteady spent more time optimizing AA to include the superior SMAA modes instead of the TXAA screen-smudge/blurfest. Thirdly, wouldn't it be better if Rocksteady spent time improving textures and in-game shaders instead of worrying about "Nvidia DOF", "PCSS+", and "HBAO+"? All 3 of those features can easily be substituted with standard effects that look as good if not better.

For example, if we look at PCSS+ in GTA V:

"In terms of performance, we found Rockstar's Soft Shadow method is by far the fastest. Enabling AMD CHS takes performance down some from there and NVIDIA PCSS taking the largest performance hit on both GPUs. AMD CHS and NVIDIA PCSS were a VRAM savings on the GTX 980, but were all similar on the R9 290X in terms of VRAM."

So you would think, well, it's the fastest so it must look the worst, but it doesn't! It looks the best too.

"....but Rockstar's "Softest" shadow option to us could potentially look better than both AMD CHS or NVIDIA PCSS. Both AMD CHS and NVIDIA PCSS have issues we have noticed in our early look at the game.

In regards to NVIDIA PCSS, at times there is no shadow at all displayed in an area where AMD CHS or Rockstar's settings show a shadow. The shadows are also a lot more blurry in some cases, especially dealing with leaves projected onto objects."
- HardOCP

So in reality, various AMD- and NV-specific features often run worse AND look worse than the developer's version. Therefore, I don't at all agree that NV's GW actually benefits the Batman games vs. what the developer could have accomplished if it put all its might and resources behind making the PC version stand out. I am not a fan of AMD GE or NV's GW and wish they would be abandoned completely. Let the developer decide how they want the game to look. By not using vendor-specific features, the developer is forced to optimize for Intel, NV, and AMD users, and forced to be creative on their own instead of using some closed-source, proprietary NV game code for HairWorks or NV's RainWorks, etc.

At this pace, why not substitute nearly every single shader in your game, since NV has one for nearly every effect? NV WaterWorks, HairWorks, RainWorks, ShadowWorks, PhysX... heck, might as well have NV or AMD make 50% of the game's shaders while the developer takes a vacation! Who is making the game, the developer or NV/AMD?
 

Techhog

Platinum Member
Sep 11, 2013
As a point of reference:

Arkham City - 1080P VHQ
GTX580 = 122 fps (PhysX Off)
GTX580 = 65 fps (PhysX High)

Arkham Origins - 1080P VHQ
GTX780 = 131 fps (no AA, PhysX Off)
GTX780 = 102 fps (4xMSAA, PhysX Off)
GTX780 = 76 fps (no AA, PhysX High)
GTX780 = 62 fps (4xMSAA, PhysX High)

Batman: Arkham Knight doesn't look that much more demanding than Origins. I presume a GTX 970 will be able to maintain 50-60 fps with 4xMSAA and PhysX High at 1080p, and with an overclock close to 1.5GHz, 60 fps should be a reasonable expectation.

For 2560x1440 and 2560x1600 gamers, you probably won't be able to get 60 fps averages with a single card with MSAA+PhysX High.


Graphics never correlate well with performance. On top of that, we've only seen PS4 footage up to this point, in spite of the game looking noticeably better than the PC version of Origins. I'm not even sure how reliable comparing against Origins is in the first place, though. City runs worse than Origins in DX11, so Rocksteady might just suck at PC optimization.
 

RussianSensation

Elite Member
Sep 5, 2003
Throughout all of my years involved in PCs, I am talking since the 1990s, I have seen people build entire PCs just to play a single game. Over all these years, I have seen countless people upgrade GPUs, CPUs, RAM, operating systems, etc., just to play a single game at the settings they want.

I am not trying to change your mind, just making an observation. This new stance I have been seeing on AnandTech I find completely at odds with everything I have ever seen when it comes to PC gaming.

Absolutely not. This is actually 100% consistent with what PC gaming has always stood for: open standards and shared game code for the benefit of all PC gamers. Boycotting GW games is not a new stance on an old issue, as you are trying to portray it. This is an entirely new issue that PC gaming has never faced, which is why your example is at odds: you are making a connection where there is none. Never in the history of gaming graphics did ATI or NV provide proprietary, closed-source game code that could never be altered, shared, or optimized by the developer, with the developer nonetheless obligated to insert vendor-specific code into the game as part of the "marketing partnership". That all changed with GW. So you are mixing completely different topics. The passion for building a new rig and getting a $700 graphics card for one game has nothing to do with the business-ethics issues that surround GW.

GW is 100% unfair competition, since it means the firm with more $ gets to alter the GPU industry by specifically catering the game's source code to benefit its own products. This would be akin to Porsche working directly with Michelin or Pirelli to specifically alter the engineering of high-performance rubber, with NO one else in the world able to use it without Porsche's permission. You know what Michelin and Pirelli would tell Porsche? Go home, buddy! While you can often find examples of car manufacturers working directly with a supercar maker on tires specific to a vehicle, the ultimate goal of Michelin and Pirelli in the supercar market is to make high-performance tires that benefit all supercar makers!

Does it sound like the developer partnering with NV's GW benefits the 85% of the gaming market on Intel+AMD? Can you imagine if Intel threw billions of dollars at Rocksteady, Rockstar, and Blizzard instead, and all of those titles were 100% optimized for Intel's graphics pipeline/drivers?

For example, I did buy an 8800 GTS over the X1900 series for Crysis 1 because at that time NV's option was superior. However, if I had known that ATI or NV set up "marketing alliances" and as a result put a bunch of closed, black-box code into, say, Far Cry 1 or Crysis 1 that purposely hurt optimization for the competitor and for Intel's graphics, I would either not buy the game or wait until it hit $5. So your example makes no sense, because we never before had a situation where a hardware vendor provided source code that could NEVER be altered, shared, or modified in any way without NV's/AMD's permission. NV's old TWIMTBP was much like AMD's GE today, where all of the code is shared, which meant that even if the firm worked with the developer directly, Intel and AMD would eventually have access to make their own optimizations. If AMD followed NV's GW and started doing the same, I'd stop buying all those games too, or wait until they hit $5.

However, starting with the AC DX10.1 fiasco, NV stopped playing fair, imo. AMD could have easily hidden all the DirectCompute code they used in every single AMD GE title from NV/Intel, but did they? No. How some people don't understand the difference between NV's TWIMTBP/AMD's GE and NV's new GW is to this day eye-opening.

Now, if you don't care about business ethics in gaming software development, no problem, but let's not pretend this has existed in PC gaming for decades and that PC gamers have "changed to a new stance." I personally have no problem if Intel/AMD/NV work closely with developers to optimize a game's code for their GPU architectures, but if so, all of the code must be shared with everyone. Otherwise, whoever has more engineers and more marketing $ automatically wins. I don't consider that "fair competition". Down that path, Intel's graphics could have buried both NV and AMD a long time ago.

Graphics never correlate well with performance. On top of that, we've only seen PS4 footage up to this point, in spite of the game looking noticeably better than the PC version of Origins. I'm not even sure how reliable comparing against Origins is in the first place, though. City runs worse than Origins in DX11, so Rocksteady might just suck at PC optimization.

Maybe I have much higher standards than most PC gamers, or I can more easily separate gameplay from the graphics of my favourite franchises, but I haven't been impressed with the graphics of any new game released since Crysis 3/Ryse: Son of Rome. DAI, GTA V, Dying Light, AC Unity, Alien Isolation, etc.: nothing special that we haven't seen before, imo. All of these look like current-gen games, or at best a first wave for the PS4 generation, not next-gen PC games. There is absolutely no next-gen graphical leap in any of those relative to Crysis 3 and Ryse: Son of Rome. The graphics in The Order: 1886 blow GTA V away, for example, and yet most PC gamers are salivating over GTA V's graphics. It's not even close, yet PC gamers won't acknowledge it because to most of them "consoles suck".

If we look at the trailers for Batman AK, the graphics don't look impressive at all: poor textures, last-generation character movement/physics models, lighting that doesn't look next-gen in any way.
https://www.youtube.com/watch?v=V-DBvDejInI

Two years ago, UE4 showed us what next-gen real-time PC graphics should look like. Thus far, not a single game since Crysis 3 has lived up to that hype. None. GTA V, AC Unity, and Batman AK are so far behind "next-gen" graphics that you might as well consider them 1-2 generations away from this:
https://www.youtube.com/watch?v=dO2rM-l-vdQ

It's very disappointing, imo. I think TW3 will be the first game where we'll say, OK, finally, this is a true next-gen game.
 

Deders

Platinum Member
Oct 14, 2012
For NV users, yes, while alienating everyone with an Intel or AMD card. PhysX, TXAA: you can't use any of those features on those other products. 85% of the gaming market is Intel+AMD. So who exactly did it benefit for Rocksteady to target the game 100% at NV cards only? We know why Rocksteady did it: marketing and $$$. Why else would they do it?

Let me ask you this: imagine if Rocksteady made in-game physics effects (à la Red Faction: Guerrilla) that work on Intel, NV, and AMD hardware. Wouldn't that be better? Secondly, imagine if Rocksteady spent more time optimizing AA to include the superior SMAA modes instead of the TXAA screen-smudge/blurfest. Thirdly, wouldn't it be better if Rocksteady spent time improving textures and in-game shaders instead of worrying about "Nvidia DOF", "PCSS+", and "HBAO+"? All 3 of those features can easily be substituted with standard effects that look as good if not better.

For example, if we look at PCSS+ in GTA V:

"In terms of performance, we found Rockstar's Soft Shadow method is by far the fastest. Enabling AMD CHS takes performance down some from there and NVIDIA PCSS taking the largest performance hit on both GPUs. AMD CHS and NVIDIA PCSS were a VRAM savings on the GTX 980, but were all similar on the R9 290X in terms of VRAM."

So you would think, well, it's the fastest so it must look the worst, but it doesn't! It looks the best too.

"....but Rockstar's "Softest" shadow option to us could potentially look better than both AMD CHS or NVIDIA PCSS. Both AMD CHS and NVIDIA PCSS have issues we have noticed in our early look at the game.

In regards to NVIDIA PCSS, at times there is no shadow at all displayed in an area where AMD CHS or Rockstar's settings show a shadow. The shadows are also a lot more blurry in some cases, especially dealing with leaves projected onto objects."
- HardOCP

So in reality, various AMD- and NV-specific features often run worse AND look worse than the developer's version. Therefore, I don't at all agree that NV's GW actually benefits the Batman games vs. what the developer could have accomplished if it put all its might and resources behind making the PC version stand out. I am not a fan of AMD GE or NV's GW and wish they would be abandoned completely. Let the developer decide how they want the game to look. By not using vendor-specific features, the developer is forced to optimize for Intel, NV, and AMD users, and forced to be creative on their own instead of using some closed-source, proprietary NV game code for HairWorks or NV's RainWorks, etc.

At this pace, why not substitute nearly every single shader in your game, since NV has one for nearly every effect? NV WaterWorks, HairWorks, RainWorks, ShadowWorks, PhysX... heck, might as well have NV or AMD make 50% of the game's shaders while the developer takes a vacation! Who is making the game, the developer or NV/AMD?

I do understand your frustration, but I was talking specifically about the Batman games. I do wish that AMD had taken Nvidia's offer to license hardware PhysX; it might have changed the landscape of gaming at the time. But I can understand why they wouldn't have wanted to just give it away after the acquisition and development.

As for GTA's shadows:

1: GTA wasn't specifically an AMD or Nvidia title; in fact AMD's marketing (or at least someone) announced it was a Gaming Evolved title based on the fact that it had CHS shadows, when it actually offers you the option of both.

2: They are optional, and quite different in nature. The screenshots you have don't really do them justice, as Nvidia's start off sharp and get progressively blurrier the farther out they go, imitating how real shadows behave, or slightly exaggerated versions of them. This is especially effective when going from day to night or vice versa, when you can see the sun casting long shadows over the landscape. There is nothing stopping AMD from developing their own version, but they went with CHS.

I've already mentioned my disdain for having all the GameWorks games look alike, and it does put me off them, but I will definitely be getting the next Batman game because I enjoyed the first two so much.

I wholeheartedly agree about TXAA vs SMAA; I never use TXAA and always try to get SMAA into the mix.

I am wondering where you got your 85% statistic from. According to the Steam survey, only 28.54% of the systems surveyed have an AMD card.
 

3DVagabond

Lifer
Aug 10, 2009
Throughout all of my years involved in PCs, I am talking since the 1990s, I have seen people build entire PCs just to play a single game. Over all these years, I have seen countless people upgrade GPUs, CPUs, RAM, operating systems, etc., just to play a single game at the settings they want.

I am not trying to change your mind, just making an observation. This new stance I have been seeing on AnandTech I find completely at odds with everything I have ever seen when it comes to PC gaming. I know there can be emotional attachments to certain brands and hardware, but I have never seen anything like this before. Don't get me wrong, I do not mean any disrespect. I am not even saying your stance is wrong. Just saying.

I don't think it will ever catch on, this idea of boycotting games because someone with an Nvidia card might get higher frame rates. I remember a time when you couldn't play all new games without changing out graphics cards. Games you almost couldn't play at all without a specific graphics card.

It is not even like that today. Batman will run on AMD hardware as well as NV hardware. Nvidia pays money for extra bells and whistles that can be turned on or off. Who really cares if a guy with an Nvidia card gets x% higher frame rates in a game that Nvidia put a lot of work into?

Sorry!
That is just me thinking out loud. I am not trying to change your mind. Just my thoughts on it.

That kind of thing is bad for PC gaming. It's already hard enough to make games run on all of the different hardware in PCs without purposely sabotaging one of the two main IHVs. Compatibility is the cornerstone of the development of the PC.

I'm not trying to change anyone's mind either. Just clarifying why I'm against such practices.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
How many games can you think of where you see this sort of brand bias? I wouldn't be surprised if the 980 were 40-50% faster than an R9 290X in this title. ^_^

Dying Light was pretty Nvidia-biased as well. I remember Tom's Hardware doing a benchmark where the GTX 960 and R9 290X were neck and neck.
 

ocre

Golden Member
Dec 26, 2008
I know, perhaps I am old-fashioned, but I don't believe in the sabotage theory at all. I think it is exactly as I have said many times: Nvidia spends extra time and man-hours early on in these games and optimizes to the highest degree. There is nothing stopping AMD from working with these developers just as well, just like they did with Rockstar.

The fact that developers can't share GameWorks code gets turned into "developers can't work with AMD at all." I do not believe that for one second. It is just as far-fetched as saying Nvidia sabotages AMD performance; what developer would allow that? That would be totally illegal anyway.
The GameWorks features, such as HBAO+, are Nvidia code that Nvidia spends its own time and money implementing. These bolt-on features are optional, and half of them don't even work on AMD hardware at all. Every game I can think of defaults to no Nvidia features: the game in its vanilla form, just as the developer gave it to us. Even on Nvidia HW, you have to enable them. And as RS already posted above, these features aren't automatically better than what the developer was giving us anyway.

So what if Nvidia has added an extra feature that no one forces you to use?
This is getting demonized as people paint AMD as the victim.

It gets old after a while. We turn every single thread into the same thing. Batman AK will be an amazing game, so it doesn't much matter how much a few overly eager people want to hate on it. It just gets old to hear the same song over and over. People who aren't even interested in getting the game are here to do what, exactly? Fighting the good fight against evil Nvidia, yeah!
Give me a break
 

RussianSensation

Elite Member
Sep 5, 2003
I am wondering where you got your 85% statistic from. According to the Steam survey, only 28.54% of the systems surveyed have an AMD card.

I think you misread what I stated: 85% of the gaming market is Intel+AMD. By supporting GW, a developer alienates the 85% of the gaming market that either cannot use GW features or suffers a huge penalty when they are enabled, since neither Intel nor AMD can optimize for any of the proprietary source code NV puts into a GW title.


http://jonpeddie.com/press-releases...m-last-quarter-intel-dropped-4-amd-slipped-7/

Intel will continue to improve its gaming performance, and as a result more gamers will be able to play games on Intel graphics. Long-term, the more developers use GW features, the more it will hurt gamers on Intel graphics too.

Batman: Arkham Origins - Intel HD Graphics 4600

Let's assume Intel doubles its GPU performance every 3 years. The Intel Iris 6200 will likely score ~2500 in 3DMark11. That means in 9 years (three doublings: 2500 → 20,000), Intel's "free" graphics will be as fast as GTX 980M SLI.
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html

All these GW games made today: those Intel gamers in 2024 won't be able to use TXAA, PhysX, etc., and at the pace we are going, that will be 80%+ of the entire GPU market!
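The extrapolation above can be sketched in a few lines (a hypothetical model: the fixed 3-year doubling period and the ~2500 starting score are the post's assumptions, and real GPU scaling will not be this clean):

```python
# Hypothetical projection: an integrated-GPU benchmark score under a
# fixed doubling period, as assumed in the post above.

def projected_score(base_score: float, years: float, doubling_period: float = 3.0) -> float:
    """Project a benchmark score forward, doubling every `doubling_period` years."""
    return base_score * 2 ** (years / doubling_period)

# Assumed Iris 6200 3DMark11 score of ~2500, projected 9 years out:
print(projected_score(2500, years=9))  # 20000.0, i.e. three doublings
```

Nine years is exactly three doubling periods, which is where the 8x (2500 → 20,000) figure comes from.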
 
Aug 11, 2008
Absolutely not. This is actually 100% consistent with what PC gaming has always stood for: open standards and shared game code for the benefit of all PC gamers. Boycotting GW games is not a new stance on an old issue, as you are trying to portray it. This is an entirely new issue that PC gaming has never faced, which is why your example is at odds: you are making a connection where there is none. Never in the history of gaming graphics did ATI or NV provide proprietary, closed-source game code that could never be altered, shared, or optimized by the developer, with the developer nonetheless obligated to insert vendor-specific code into the game as part of the "marketing partnership". That all changed with GW. So you are mixing completely different topics. The passion for building a new rig and getting a $700 graphics card for one game has nothing to do with the business-ethics issues that surround GW.

GW is 100% unfair competition, since it means the firm with more $ gets to alter the GPU industry by specifically catering the game's source code to benefit its own products. This would be akin to Porsche working directly with Michelin or Pirelli to specifically alter the engineering of high-performance rubber, with NO one else in the world able to use it without Porsche's permission. You know what Michelin and Pirelli would tell Porsche? Go home, buddy! While you can often find examples of car manufacturers working directly with a supercar maker on tires specific to a vehicle, the ultimate goal of Michelin and Pirelli in the supercar market is to make high-performance tires that benefit all supercar makers!

Does it sound like the developer partnering with NV's GW benefits the 85% of the gaming market on Intel+AMD? Can you imagine if Intel threw billions of dollars at Rocksteady, Rockstar, and Blizzard instead, and all of those titles were 100% optimized for Intel's graphics pipeline/drivers?

For example, I did buy an 8800 GTS over the X1900 series for Crysis 1 because at that time NV's option was superior. However, if I had known that ATI or NV set up "marketing alliances" and as a result put a bunch of closed, black-box code into, say, Far Cry 1 or Crysis 1 that purposely hurt optimization for the competitor and for Intel's graphics, I would either not buy the game or wait until it hit $5. So your example makes no sense, because we never before had a situation where a hardware vendor provided source code that could NEVER be altered, shared, or modified in any way without NV's/AMD's permission. NV's old TWIMTBP was much like AMD's GE today, where all of the code is shared, which meant that even if the firm worked with the developer directly, Intel and AMD would eventually have access to make their own optimizations. If AMD followed NV's GW and started doing the same, I'd stop buying all those games too, or wait until they hit $5.

However, starting with the AC DX10.1 fiasco, NV stopped playing fair, imo. AMD could have easily hidden all the DirectCompute code they used in every single AMD GE title from NV/Intel, but did they? No. How some people don't understand the difference between NV's TWIMTBP/AMD's GE and NV's new GW is to this day eye-opening.

Now, if you don't care about business ethics in gaming software development, no problem, but let's not pretend this has existed in PC gaming for decades and that PC gamers have "changed to a new stance." I personally have no problem if Intel/AMD/NV work closely with developers to optimize a game's code for their GPU architectures, but if so, all of the code must be shared with everyone. Otherwise, whoever has more engineers and more marketing $ automatically wins. I don't consider that "fair competition". Down that path, Intel's graphics could have buried both NV and AMD a long time ago.



Maybe I have much higher standards than most PC gamers, or I can more easily separate gameplay from the graphics of my favourite franchises, but I haven't been impressed with the graphics of any new game released since Crysis 3/Ryse: Son of Rome. DAI, GTA V, Dying Light, AC Unity, Alien Isolation, etc.: nothing special that we haven't seen before, imo. All of these look like current-gen games, or at best a first wave for the PS4 generation, not next-gen PC games. There is absolutely no next-gen graphical leap in any of those relative to Crysis 3 and Ryse: Son of Rome. The graphics in The Order: 1886 blow GTA V away, for example, and yet most PC gamers are salivating over GTA V's graphics. It's not even close, yet PC gamers won't acknowledge it because to most of them "consoles suck".

If we look at the trailers for Batman AK, the graphics don't look impressive at all: poor textures, last-generation character movement/physics models, lighting that doesn't look next-gen in any way.
https://www.youtube.com/watch?v=V-DBvDejInI

Two years ago, UE4 showed us what next-gen real-time PC graphics should look like. Thus far, not a single game since Crysis 3 has lived up to that hype. None. GTA V, AC Unity, and Batman AK are so far behind "next-gen" graphics that you might as well consider them 1-2 generations away from this:
https://www.youtube.com/watch?v=dO2rM-l-vdQ

It's very disappointing, imo. I think TW3 will be the first game where we'll say, OK, finally, this is a true next-gen game.

You have a point, although tbh I mostly just skip your walls of text. But come on, do you really think people are trying to run AAA GameWorks titles on Intel graphics?
 

Vaporizer

Member
Apr 4, 2015
I know, perhaps I am old-fashioned, but I don't believe in the sabotage theory at all. I think it is exactly as I have said many times: Nvidia spends extra time and man-hours early on in these games and optimizes to the highest degree. There is nothing stopping AMD from working with these developers just as well, just like they did with Rockstar.

The fact that developers can't share GameWorks code gets turned into "developers can't work with AMD at all." I do not believe that for one second. It is just as far-fetched as saying Nvidia sabotages AMD performance; what developer would allow that? That would be totally illegal anyway.
The GameWorks features, such as HBAO+, are Nvidia code that Nvidia spends its own time and money implementing. These bolt-on features are optional, and half of them don't even work on AMD hardware at all. Every game I can think of defaults to no Nvidia features: the game in its vanilla form, just as the developer gave it to us. Even on Nvidia HW, you have to enable them. And as RS already posted above, these features aren't automatically better than what the developer was giving us anyway.

So what if Nvidia has added an extra feature that no one forces you to use?
This is getting demonized as people paint AMD as the victim.

It gets old after a while. We turn every single thread into the same thing. Batman AK will be an amazing game, so it doesn't much matter how much a few overly eager people want to hate on it. It just gets old to hear the same song over and over. People who aren't even interested in getting the game are here to do what, exactly? Fighting the good fight against evil Nvidia, yeah!
Give me a break
Thing is, all the tech sites will bench the game with GameWorks enabled (whose code is hidden from AMD), and AMD usually performs poorly at those settings because they need longer to adjust their drivers to work with these black boxes. Almost no tech site will bench with Ultra and GameWorks effects off.
And based on such biased benches, people select their video card.
 

digitaldurandal

Golden Member
Dec 3, 2009
Throughout all of my years involved in PCs, I am talking since the 1990s, I have seen people build entire PCs just to play a single game.

It isn't about an emotional attachment to a brand. I have both AMD and Nv GPUs. My last build was 670 SLI and now I am on 290 CF.

Whether or not a boycott catches on is not necessarily relevant to my decision on whether or not to buy a game. I personally do not want to support companies that are intentionally creating a schism in the PC gaming community.

I refused to own two consoles when I actually had them and I refuse to have two different brand GPUs in two different rigs to play games.

If Rocksteady and Ubisoft refuse to allow AMD to optimize code and intentionally sandbag updates for months for Nvidia's benefit, I will not purchase their games.

For me, I have a backlog of at least a dozen games right now anyway, and I won't be crying that I couldn't play a Batman game. When I hear that Gameworks games have finally been patched and AMD performance is decent, then I might buy it.

These developers are making a gamble that they will get more money from Gamesworks than they would have from AMD customers and they are probably right, but again, it is my money and I will choose to buy the game or not.

That being said, and back on topic: if DX12 support is patched in, I think it will be much harder to cripple AMD, so I will likely be looking for reviews of the game in DX12. Probably a Steam sale in December for me, honestly.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Thing is, all tech sites will bench the game with Gameworks enabled (code which is hidden from AMD), and AMD usually performs poorly at those settings because they need more time to get their drivers working with these black boxes. Almost no tech site will bench at ultra with the Gameworks effects off.
And people select their video card based on such biased benches.

The vast majority of GPU purchasers do not read reviews. The informed consumer is few and far between, my friend.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
But come on, do you really think people are trying to run AAA gameworks titles on intel graphics?

1. Yes. Do you really think none of the 71% of gamers on Intel graphics play AAA games? There are lots of YouTube videos of gamers playing GW titles on Intel-powered laptops.

2. Intel graphics do not stand still. Why wouldn't someone with a Cannonlake Iris GPU want to play Far Cry 3/4 or the Batman games? It's not as if high-quality games stop being worth playing with time. A lot of classic games are still worth playing even though they are older.

3. Some people simply buy games later, when they fall to $5-10. They might have a backlog, or they're budget gamers. The new AAA games we are talking about won't be new to them in 2-3 years; refer to point #2. In 5-10 years, all of these modern AAA GW games will be 100% playable on Intel graphics. A lot of PC gamers will buy them on GOG and so on. So it's impossible to escape the fact that GW hurts Intel gamers too.

I know, perhaps I am old-fashioned, but I don't believe in the sabotage theory at all. I think it is exactly as I have repeated many times: Nvidia spends extra time and man-hours early on in these games and has optimized them to the highest degree. There is nothing stopping AMD from working with these developers just as well, just like they did with Rockstar.

It's not a theory, it's a fact. If part of a game's code is closed to you, your performance on that part is automatically sabotaged, since you can't optimize your driver for it. Your performance on those code paths will remain exactly as it was on day 1 until you release more powerful graphics cards; you can only optimize for the rest of the game's code to improve performance. It also means CrossFire not working until patches are released, and there is nothing AMD can do to speed up CF in that case.

Also, your last statement shows a misunderstanding of how GW works. AMD does work closely with developers to optimize games for its architecture, but all of that work is 100% transparent to the developer and to NV: it can be shared, modified, and altered by the developer or by NV's driver team. GW is not like that at all. Would you like it if AMD and Intel started shoving proprietary, closed-source code into AAA games, forcing you to buy an AMD and an Intel graphics card for those games? That's exactly the path GW is going down.

GW is doing exactly what it set out to do: force the gamer to pick NV as the preferred option. For example, the OP is worried that his card won't run the game well. If Rocksteady optimized the game equally for both AMD and NV, he wouldn't be as worried. Knowing the game is basically made 100% for NV, his worries are justified.
 
Last edited:

Vaporizer

Member
Apr 4, 2015
137
30
66
The vast majority of GPU purchasers do not read reviews. The informed consumer is few and far between, my friend.
So these people buy their PC at a shop and get their information from the salesman, who extracts his information from the biased reviews. So you can't deny or get rid of the effect that Gameworks has, or what it was intended for.
 

BlockheadBrown

Senior member
Dec 17, 2004
307
0
0
I wonder how an i5 Alienware Alpha would handle this. Any thoughts?

Anyone? Bump...

--> Specifically the non-upgradeable GPU side:
NVIDIA® GeForce® GTX 860M+ GPU 2GB GDDR5
"The chosen graphics processor found in the Alpha is based on the GTX 860M - which, at an architectural level at least, is identical to the desktop GTX 750 Ti we reviewed earlier this year (both are based on Nvidia's GM107 chip design). Alienware reckons that this GTX 860M has been clocked higher than it ever has before, and although the firm won't be drawn on performance, it hopes that it will produce better results than the standard desktop 750 Ti."
Source: http://www.eurogamer.net/articles/digitalfoundry-2014-alienware-alpha-spec-analysis

Also:
http://gpuboss.com/gpus/ASUS-GeForce-GTX-750-Ti-vs-ASUS-GeForce-GTX-660
http://gpuboss.com/gpus/GeForce-GTX-860M-vs-GeForce-GTX-660
(The GTX 660 is the listed minimum requirement.)

Based on those comparisons alone, it doesn't look good for the Alienware Alpha against AK's minimum specs.
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I'm sure you'll be fine. Minimum specs are typically overstated pretty badly. You might have to tone down the resolution and settings, but you'll be able to run it.
 