Yeah, it's pretty bad. I'm not even sure you can mod it if it's hard-coded. TBH, there used to be a time when I spent hours researching that stuff. Now if a game is that badly broken, I just move on.
You're missing the point; it's meant to be like that, to make you feel claustrophobic. Like when you crouch, your FOV is even smaller.
I know what you're saying, but how that "effect" affects people is entirely relative. I don't feel "claustrophobic" when I see it, I just constantly, subconsciously think "half of my screen is missing for no reason". It's like one of those old early letterboxed (non-anamorphic) "widescreen" DVDs that were nothing more than 4:3 full frame with black bars inserted inside the frame, giving the illusion of widescreen on a 4:3 set - until you watched it on a screen that matched the "apparent ratio"... (See the earlier screenshot by NTMBK with black bars around all four screen edges for the end result.)
A lot of horror effects are highly relative. Dark moody textures, a creepy soundtrack, plentiful shadows, bloom, HDR & "fright-jumps", etc., are great, but to me black bars & 30fps are gimmicks that add nothing. The same goes even for non-horror effects (e.g., depth of field & motion blur, which look bad more often than they look good). Thief 3's (Deadly Shadows) Shalebridge Cradle had a more unnerving mood/feel to it than many dedicated 'scary' games - and that wasn't even a horror-genre game (or the best Thief game, for that matter).
Maybe it uses AVX code? That may explain why it runs faster on Haswell.
It would, more specifically, have to run 256-bit AVX or AVX2 code to be faster on Haswell.
But we are missing Ivy Bridge in the mix to even make an estimate about it. And again, we are talking about a game the devs confirm is broken beyond 30FPS.
It's not broken; it's set at that frame rate for a reason.
The reason is that it breaks at any other framerate...
You're not meant to change it; not every game has to run at 60fps, you know... and if it doesn't, that does not = "bad port".
This game has its own style, and I don't think you should change anything about the frame rate or the aspect ratio. Play it how the director intended you to play it.
You're not meant to change it; not every game has to run at 60fps, you know... and if it doesn't, that does not = "bad port".
This game has its own style, and I don't think you should change anything about the frame rate or the aspect ratio. Play it how the director intended you to play it.
I will accept that running it at 21:9 aspect ratio is an artistic choice. But coding it so badly that on a 21:9 monitor it letterboxes both horizontally and vertically? That's a bad port.
I'm not arguing, but there is no small debate about whether higher-than-24fps footage looks "less cinematic" for actual movies. Might be related. Or not.
Maybe it's PCIe bandwidth? Sandy Bridge was PCIe 2.0, Ivy Bridge was PCIe 3.0. If the game is swapping textures all the time (due to being designed for a unified memory console), that might explain the difference.
Too many people are quick to shout "bad port" these days, when that is clearly not true.
A lot of people don't like the current spate of 30fps "artistic choice" PC games, as it coincides with a general trend of watered-down, lowest-common-denominator 30fps PC ports caused by weak console hardware, with AAA devs hiding behind the excuse of "artistic choice" (for games unrelated to horror genres) so as not to 'offend' console gamers. Whether this particular game, The Evil Within, was capped for that reason or it's simply coincidence, either way it just sets a bad precedent. The last thing PC gamers need is a general 30fps "bandwagon" trend, even if some games were written for 30fps for genuine reasons other than performance or 'platform parity'.
In short - a lot of people are very wary of the following "slippery slope":
"Evil Within is set at 30fps due to artistic choice"
Ubisoft: "Yes, yes, we're doing the same with AC Unity, The Crew, The Division, etc., and maybe all future games - it's all out of 'artistic choice', we tell you, and has nothing whatsoever to do with the fact that the consoles we're writing them for couldn't even manage a consistent 60fps if we wanted them to..."
EA: "Oh yes, that's why we fixed Need for Speed Rivals at 30fps; everyone knows high-speed racing games look more 'arty' at 30fps than they play smoothly and responsively at 60fps."
Sure guys, we believe you... :sneaky: :thumbsdown:
Maybe it's PCIe bandwidth? Sandy Bridge was PCIe 2.0, Ivy Bridge was PCIe 3.0. If the game is swapping textures all the time (due to being designed for a unified memory console), that might explain the difference.
I'd be very surprised if PCIe 2.0 x16 wasn't enough bandwidth.
http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/
Even PCIe 2.0 x8 doesn't seem to suffer much in the tests above. Obviously this is a new game ported from the new consoles, so it could be more demanding in that way. But I wouldn't think it should matter too much.
Don't buy the game then, if the frame rate bothers you. It's really that simple. The sense of entitlement from PC gamers is getting ridiculous now; this game was always going to be 30fps, because that is what they wanted to do.
I don't agree with "lower frame rate = more cinematic", but for The Evil Within it really does not matter. You don't notice, as the game (on my PC) never drops a frame.
Folk will be screaming "bad port" over this game, but in reality it's the best version of the game by far. How can the best version of a game be a bad port? Sure, we would all love extra work done on the PC versions of games, but let's face it, they are not going to do it, as it costs money that they will never get back.
Expecting something as trivial as an unlocked framerate in a $60 PC game is entitlement now, awesome.
Yes, it is.
The developer wants the game to run at 30fps; they decide, not us.
Buy it or don't.
I can assure you, I have no interest in buying any game where the devs claim 30fps is somehow a better experience. It's a ridiculous argument that has been disproved time and time again.
That some people actually buy into that nonsense is even sadder. I'm sure if a dev comes around and says a 640x480 resolution delivers a more cinematic feel because it reminds them of good old VHS, some people will support that too.
http://30vs60.com/
http://www.30vs60fps.com/
http://www.testufo.com/#test=framerates
http://www.pcgamer.com/the-features-pc-gamers-wantan-open-letter-to-developers-and-gamers/
This game does not need to run at 60fps; it's slow-paced and you don't notice the frame rate at all. The director clearly had a vision of what game he wanted to make, and he has made that game.
You or I might not agree with his choices but he's stuck by them and the game is actually pretty good and well polished.
Either buy the game or don't; it is what it is.