One thing has been bugging me about game design trends over the past few years. Developers seem obsessed with replacing your in-game avatar (a person) with a cheap, badly made 1970s camcorder with arms & legs attached, then squeezing in every negative 'effect' that mimics a cr*p camera. I'm referring to all the unrealistic "realism" effects that often reduce immersion more than they add to it. You know, stuff like:
- "Myopia simulator" (Depth of Field)
- "Cataract simulator" (Motion Blur)
- "Glaucoma simulator" (Vignetting)
- "Severe migraine simulator" (Film Grain)
- "I love pretending to wear dorky anaglyph red & blue 3D glasses simulator" (Chromatic Aberration)
- "I love walking round holding up a pane of glass in front of my face simulator" (excessive lens flare or rain / blood trickling down the screen as if you're supposed to be 'seeing' water trickle down 6" in front of your own face).
I know a lot of this is "art style vs realism", but even then, whilst I find "bokeh photography" (DOF) an aesthetically pleasing art style for static photos (especially at night), it certainly doesn't carry over to "a wall of massive fake blur is good" in video games for me:
Depth of Field - I find DOF one of the most annoying effects in rendered games. It heavily blurs everything beyond x metres to force the player's attention onto one spot, mimicking a camera. The three problems with this are:
1. My avatar has not had his eyeballs removed and replaced with cameras (and neither have I).
2. "Monovision camera blur on a 2D monitor" is not remotely how humans with stereoscopic vision perceive out-of-focus objects at distance (they appear just as much "doubled" as blurred).
3. The game doesn't know where you (the player) are focusing, and falsely assumes you'll be constantly staring only at things within 10ft of your character.
DOF works for static portrait photographs you print off and hang on a wall. It's also a natural, inevitable property of light captured through lenses (passive media, i.e., movies & TV). But in real life, when you have a face-to-face conversation with someone on the street, everything beyond 10m doesn't turn into a "wall of blur", and you do in fact look around at the background, "mid-ground", etc, all the time. There's also the psycho-visual aspect: even when you aren't looking at the horizon, you don't perceive it as blurred, because it isn't blurred whenever you do look at it, and some of that memory gets overlaid onto your peripheral vision by the brain (that's also how many optical illusions work). "Fake blurring" the entire background on a 2D monitor while the player is actively trying to look at that same background looks & feels far more unnatural & fake to me than switching DOF off completely.
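To make the complaint concrete, here's a minimal sketch of the depth-threshold style of game DOF described above (my assumption of a typical implementation - the frame, depth map and 10m focus distance are all made-up stand-ins, not any engine's actual code):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
sharp = rng.random((240, 320))  # stand-in for the rendered frame
# Hypothetical per-pixel depth map: 1m at the left edge to 50m at the right.
depth = np.tile(np.linspace(1.0, 50.0, 320), (240, 1))

FOCUS_DIST = 10.0  # everything past this becomes the "wall of blur"
blurred = gaussian_filter(sharp, sigma=6.0)

# Blend sharp and blurred purely by depth - the effect has no idea
# where on the screen the player is actually looking (problem 3).
mask = (depth > FOCUS_DIST).astype(float)
frame = sharp * (1.0 - mask) + blurred * mask
```

Note that the blend is driven entirely by per-pixel depth; nothing in it can respond to your gaze, which is exactly why it falls apart the moment you try to look into the background.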
Motion blur - If your vision blurs just from turning your head quickly or running in a straight line, that sounds more like a partially detached retina or early-stage cataracts. And concussion-based blur is more of a psycho-visual thing than a purely visual effect, i.e., you "feel" it in your head more than you see it. If you were caught in an explosion and flung hard against an object to the point you could barely make out the shapes of objects, you'd also probably spend the next 30 minutes trying not to be sick or to stand up without falling over. Obviously it wouldn't be "fun" to replicate that, but that's also why faking only the blurred-vision part looks silly. The only real use is hiding the judder on consoles locked to 30fps (as it does in 24fps movies). At 60-120fps on PC, though, "silky smooth blur" often looks more stupid than it adds immersion.
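For what it's worth, the "camera smear" variant is often just an accumulation blend of recent frames. A hedged sketch of that idea (function and parameter names are my own illustration, not any engine's code):

```python
import numpy as np

def smear(frames, persistence=0.5):
    """Exponentially blend each frame with its predecessors."""
    out, acc = [], frames[0]
    for f in frames[1:]:
        acc = persistence * acc + (1.0 - persistence) * f
        out.append(acc)
    return out

rng = np.random.default_rng(1)
frames = [rng.random((240, 320)) for _ in range(8)]  # stand-in frames
smeared = smear(frames)
```

At 30fps each smeared frame papers over a big temporal gap; at 120fps the gaps are already tiny, so the same blend just softens an image that was smooth to begin with.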
Vignetting - I don't get it. Why is my in-game avatar superhero constantly holding an oval-shaped photo frame up in front of his face? Or maybe he's making hand-glasses? LOL. The only time the edges of your vision would darken for absolutely no reason, the way some games portray it, is with mid-stage glaucoma.
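The effect itself is trivially simple - just a radial darkening multiplied over the frame. A minimal sketch (parameter names and values are illustrative assumptions):

```python
import numpy as np

h, w = 240, 320
y, x = np.mgrid[0:h, 0:w]
# Normalised distance from screen centre: 0 at centre, ~1 in the corners.
r = np.hypot((x - w / 2) / (w / 2), (y - h / 2) / (h / 2)) / np.sqrt(2)

STRENGTH = 0.6                    # how dark the corners get
vignette = 1.0 - STRENGTH * r**2  # the "oval photo frame"

frame = np.random.default_rng(2).random((h, w))
framed = frame * vignette         # edges of vision darken for no reason
```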
Film grain - Why? Nothing in a 3D-rendered game has been "filmed" onto silver-halide stock. Mass Effect had this and it looked utterly absurd. Old films also had mono sound, sepia tints, 24fps locks and splotches on the screen - should we add those in too?
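And the implementation makes the absurdity obvious: it's just fresh random noise layered over every frame of an image that never went near film. A hedged sketch (values are my own stand-ins):

```python
import numpy as np

rng = np.random.default_rng(3)
frame = rng.random((240, 320))  # stand-in for the pristine rendered frame

GRAIN = 0.08  # grain intensity, regenerated every frame
noise = rng.normal(0.0, GRAIN, frame.shape)
grainy = np.clip(frame + noise, 0.0, 1.0)  # deliberately degrade the image
```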
Chromatic Aberration - Probably the dumbest modern game effect there is. Real-life CA in modern lenses is extremely small, and what little there is gets corrected in post-processing anyway. Do you see it on TV broadcasts? In most movies? Do you see the edges of buildings split into red / blue fringes with your own eyes (even though your eye's three types of cone cells have different natural sensitivities to different color wavelengths, the lot is filtered out by your brain)? Of course not. The only time you'll encounter significant CA is under heavy magnification-related divergence - satellite imaging, telescopes and microscopes - and even then most of it is corrected in software, both via post-processing and pre-calibration. So not only is this effect completely unrelated to rendered gaming or what the human eye sees (or what your in-game avatar should be seeing), it's not even something you see to this exaggerated degree through the bulk of movie camera lenses either.
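Compare that to how games typically fake it (a rough sketch under my assumptions, not any specific engine's shader): simply shoving the red and blue channels a few whole pixels apart, which is why edges fringe like cheap anaglyph glasses rather than like any real corrected lens:

```python
import numpy as np

rng = np.random.default_rng(4)
frame = rng.random((240, 320, 3))  # stand-in RGB frame

SHIFT = 3  # pixels - real corrected lenses are nowhere near this
ca = frame.copy()
ca[..., 0] = np.roll(frame[..., 0],  SHIFT, axis=1)  # red channel right
ca[..., 2] = np.roll(frame[..., 2], -SHIFT, axis=1)  # blue channel left
```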
Am I alone in thinking "less is more" and "subtle realism" should be the guiding principle of post-processing FX, not the "fake camera lens in a video game" equivalent of the "Michael Bay mindset" of CGI-ing in a 5,000 megaton nuclear explosion every time a skateboard collides with a bicycle? Anyone else turn a lot of this cr*p off or down by default for non-performance reasons (simply because the game looks more real / natural without it)? Seriously, if Geralt / the Dragonborn, etc, are all half-blind myopics (with tunnel vision on top), barely capable of making out an ordinary tree 25 metres away whilst talking to someone 2m in front of them, and require heavily tinted spectacles to stop the edges of objects breaking up like a prism, they really shouldn't be out adventuring at all... :biggrin:
- "Myopia simulator" (Depth of Field)
- "Cataract simulator" (Motion Blur)
- "Glaucoma simulator" (Vignetting)
- "Severe migraine simulator" (Film Grain)
- "I love pretending to wear dorky anaglyph red & blue 3D glasses simulator" (Chromatic Aberration)
- "I love walking round holding up a pane of glass in front of my face simulator" (excessive lens flare or rain / blood trickling down the screen as if you're supposed to be 'seeing' water trickle down 6" in front of your own face).
I know a lot of this is "art style vs realism", but even then whilst I find "Bokeh photography" (DOF) is an aesthetically pleasing art style for static photo's (especially at night), it certainly doesn't carry over to "wall of massive fake blur is good" in video games to me:-
Depth of Field - I find DOF to be one of the most annoying effects in rendered games. It heavily blurs everything beyond x meters to force the player to focus attention on a spot to try and mimic a camera. The three problems with this are : 1. My avatar has not had his eyeballs removed and replaced with cameras (neither have I), 2. "Monovision camera blur on a 2D monitor" is not remotely how humans with stereoscopic vision perceive out of focus objects at distance (they are just as much "doubled" as blurred), and 3. They don't know where you (the player) are focusing & falsely assume you'll be constantly staring only at things within 10ft of your character. DOF works for static portrait photographs to print off and hang on a wall. It's also a natural inevitable effect of light captured through lenses (passive interaction movies & TV). But in real life, whenever you have a face to face conversation with someone on the street, everything else beyond 10m doesn't get turned into a "wall of blur", and you will in fact look around you into the background, "mid-ground", etc, all the time. Likewise there's the psycho-visual aspect to it - even when you aren't looking at the horizon, you do not perceive it as blurred because it isn't blurred when you do look at it and some of that memory gets overlaid onto your peripheral vision in the brain (that's also how many optical illusions work). Trying to "fake blur" everything in the background on a 2D monitor when you are actively trying to look at same background looks & feels far more unnatural & fake to me than switching the whole DOF off completely.
Motion blur - If your vision is blurry just from rapidly turning your head or running in a straight line, it sounds more like you've got a partially detached retina or early stage cataracts. And concussion based blur is more of a psycho-visual thing than purely visual effect, ie, you "feel" it in your head more than you see it. If you were caught in an explosion and flung hard against an object to the extent you could barely make out shapes of objects, you'd also probably spend the next 30mins trying to not be sick or stand up without falling over. Obviously it wouldn't be "fun" to replicate that, but then that's also why trying to fake only the blurred vision effect looks silly. The only real use is trying to hide the judder in consoles locked to 30fps (as in 24fps movies). At 60-120fps on the PC though, it often looks more stupid to have "silky smooth blur" than it adds any immersion.
Vignetting - I don't get it. Why is my in game avatar superhero constantly holding up an oval shaped photo frame in front of his face? Or maybe he's making hand glasses? LOL. The only time you'd see the edges of your vision darken for absolutely no reason the way some games portray is due to mid-stage glaucoma.
Film grain - Why? Nothing has been "filmed" involving silver halide based storage media in 3D rendered games. Mass Effect had this and it looked utterly absurd. Old films also had mono sound, sepia tints, 24fps locks and splotches on the screen, should we add those in too?
Chromatic Aberration - Probably the dumbest modern game effect there is. Real life CA in modern lenses is extremely small and what little there is is corrected via post processing anyway. Do you see it on TV broadcasts? In most movies? Do you see the edges of buildings split up into red / blue with your eyeballs (even though your eyeball's 3 different types of cone cells have different natural sensitivity to different color wavelengths and the entire lot is filtered out by your brain)? Of course not. The only time you'll encounter significant CA is with heavy magnification related divergence such as satellites, telescopes and microscopes (and even then most of that too is corrected in software both post-processing and pre-calibration). So not only is this effect completely unrelated to rendered gaming or what the human eyeball sees (or what your in-game avatar should be seeing), it's not even something you see to the same exaggerated degree with the bulk of movie camera lenses either.
Am I alone in thinking "Less is more" and "subtle realism" should be the guiding principal of post-processing FX, not the "fake camera lens in video game" equivalent of the "Michael Bay mindset" of CGI-ing in a 5,000 Megaton nuclear explosion every time a skateboard collides with a bicycle? Anyone else turn a lot of this cr*p off or down by default for non-performance related reasons (simply because it looks less real / more natural)? Seriously, if Geralt / the Dragonborn, etc, are all half-blind myopics (with tunnel vision on top) barely capable of making out an ordinary tree 25 metres away whilst talking to someone 2m in front of them and require heavily tinted spectacles to stop the edges of objects breaking up like a prism, they really shouldn't be out adventuring at all... :biggrin: