Why add back problems caused by the physical limitations of optical cameras FFS?
Lens flare? We’ll just digitally add it back in.
Chromatic aberration? Add it to make it look like you’re looking through a cheap lens.
Vignette, check.
Urgh. I’m supposed to be there literally in the game, experiencing things through my own eyes.
My eyes are not cameras. Well they are but not like that.
I think the idea is that it’s meant to simulate a camera when you aren’t in first-person view. You don’t see the world from a few feet above your shoulders, after all – you’re probably used to seeing views like that through a real camera, where these things actually occur.
Oh, I get that aspect. It’s just an objectively better experience without the artifacts of the technical limits of a physical camera and lens.
It’s as if it’s driven by an idiot who thinks that if it looks like there are lens flares, aberration, vignette, etc., then it will look cinematic, completely ignoring composition, framing, lighting, depth of field… the actual arts of cinematography.
Though, given that I’m the one controlling the virtual camera with my mouse or controller, apparently it should suck as much as a real camera.
Objectively is becoming the new literally
I used objectively literally.
Avoiding flares, aberration etc makes an image objectively better. You might subjectively prefer either the objectively better or objectively worse image.
Lol it’s literally an “objectively better experience” and if your experience was different then you’re literally objectively wrong.
Look, sorry about the sass, I know I’m being a pedantic ass right now. But experience is by definition subjective. If you’d specified that the image clarity was objectively better, then you’d be totally right. But that’s not what you said.
I’m sorry you’re being a pedantic ass too.
I think introducing imperfections can in some cases enhance immersion. Our eyes do function like cameras, and have their limitations, but we don’t notice them so much. So, I think these “flaws” in games can make them more convincing in a way. I guess it’s a personal preference, I like it, but I see why it can be annoying.
I’m still using my eyes!
But I have… Special eyes!
Just like in a movie theater: people are used to 24 fps in a movie, and anything else seems weird and breaks the dreamlike quality that transports them into the world. (But games aren’t 24 fps movies, I know. Not the point.)
When you clean up all of the visual post processing, the game will look extremely clean. Which makes it feel like it’s missing some kinda extra polish. People are so used to all of these elements added for a grounded and dirtier experience that without them it looks, and more importantly, feels too game-y for Ubisoft. (Counter-Strike is super clean, for example)
Look at Resident Evil 2 Remake and you see every single cinematic option in the book, down to lens distortion, being used, and each one can be turned off in the settings. It’s the look and feel the studio wants to go for.
Glad there isn’t even one picture to show what the effect is. Are they talking about lens flare or something similar but different?
It is more or less a color halo outlining everything. It was supposed to simulate the subtle visual distortion of older camera lenses but… who the hell even wants that?
I will never understand what asshole thought adding chromatic aberration into EVERY GAME EVER was a good idea.
Probably someone that also likes CBT and the show Big Brother.
Every gimmick for verisimilitude gets abused to hell and back. We just gloss over the ones that are less frustrating to the goal of… lookin’ at stuff.
Destiny and Warframe are awash in gold because physically-based shaders made metals look super good. Ambient occlusion was egregious after Crysis, but games without a little bit feel weird now, and even the Wii got coerced into doing it efficiently. HDR tone-mapping was part of the brown-and-bloom era, but it’s still here and you’d never think twice about it.
Lens flare is more common than ever, but much better than its goofy line-of-sprites roots in the 90s, because you blur the whole screen and flip it. It doesn’t have to be blinding to be obvious and… aesthetic.
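The flip trick is simple enough to sketch in toy 1-D Python: keep only the bright pixels, mirror them around the screen center, and add them back as a ghost. Real implementations also blur and tint the result; the function name, threshold, and numbers here are all made up for illustration.

```python
def lens_flare_1d(row, threshold=200):
    # keep only the bright parts of the "screen"
    bright = [max(0, v - threshold) for v in row]
    # mirror them around the center -- the "flip the whole screen" step
    ghost = bright[::-1]
    # add the ghost back onto the original image, clamping to 8-bit range
    return [min(255, v + g) for v, g in zip(row, ghost)]

# a bright light at one side of the frame produces a faint ghost
# mirrored to the other side
frame = [0] * 6 + [255, 255] + [0] * 2
print(lens_flare_1d(frame))
```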
“God rays” and participating media / volumetric fog have been admirably restrained, considering how stupidly pretty they look, and the fact PS3 launch titles figured out you can just do it badly and blur. Downright awful sampling works so long as it’s different awful sampling from nearby pixels. Even Quake 3 did some sparse approximations on the CPU. I guess thick fog is just undesirable to developers, now that it’s not disguising tiny worlds or keeping framerates tolerable.
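The “different awful sampling from nearby pixels” part is just a per-pixel dither on the raymarch offset: each pixel starts its coarse march at a slightly different fraction of a step, so banding turns into noise that a cheap blur hides. A toy Python version, with an invented density function and made-up constants:

```python
def fog_density(t):
    # toy participating-media density: thicker fog near the camera
    return 1.0 - t

def light_shaft(pixel_x, steps=4, ray_len=1.0):
    # neighboring pixels pull different fractional offsets from a tiny
    # ordered-dither pattern, so their banding artifacts don't line up
    dither = (0.0, 0.5, 0.25, 0.75)[pixel_x % 4]
    step = ray_len / steps
    total = 0.0
    for s in range(steps):
        t = (s + dither) * step          # jittered sample position on the ray
        total += fog_density(t) * step   # accumulate scattered light
    return total

# adjacent pixels get different (wrong) answers that average out nicely
print(light_shaft(0), light_shaft(1))
```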
Unfortunately we can probably expect shakycam to take off after Unrecord. That game does a ton to look shockingly realistic, but a lot of companies will overdo about half of its tricks, and not understand why their playtesters have such queasy stomachs.
They added it because they can and I am told that it was easy to do.
There’s nothing wrong with cock and ball torture.
As an amateur astronomer with a strong eyeglass prescription, chromatic aberration is the bane of my existence. I get why they try to simulate a camera, but the more I can avoid the pitfalls of cheap low quality lenses, the better–I already have two of them on my face all the time
Same, I even pay hundreds of dollars out of pocket to have glass lenses cut because I legitimately don’t understand how people deal with the chromatic distortion and starburst effect that come with the high refraction plastic.
I mean, I do get it - people just don’t know any better. What I don’t get is why a literal doctor of optometry will look at you like you’ve got three heads when you start asking about the superior optic properties of glass.
Like bloom in the 7th gen, it was the style at the time. Someone had the shitty idea that the “camera” in games should mimic cameras (bad ones), I guess some exec liked it, and it spread across all AAA games.
I guess now we’re going back on that like we did with “brown and grey = realism” fad.
I do see bloom and light halos/rays at night thanks to an astigmatism
I don’t understand this need to make games look like they’re filmed on a camera, it kills the immersion for me. I’m playing a fantasy game, I don’t want it to look like I’m watching a shitty video. Some games also do this thing where going from dark to light makes the screen super white so you can barely see for a second; cameras do that very noticeably, but eyeballs don’t.
The only games that do the whole “blinded by the sun” thing worth a damn are the Fallout games, and it makes sense because you’ve never seen the sun in your life.
Everywhere else it needs to piss off.
Been playing Dead Island 2, and every time you go from inside to daylight the whole screen essentially goes white, and I hate it.
It’s especially annoying when racing games do that shit.
What’s really fucking stupid is that they’ll make it so it takes forever for your “sight” to adjust when going from dark to light, but driving into a tunnel gives you near-instant perfect night vision. That is how neither vision nor cameras work. Not only that, but you can always see further into tunnels than you possibly could in real life.
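For what it’s worth, the underlying mechanism is usually a per-frame exposure ease toward the scene’s measured brightness, with separate speeds for brightening and darkening, and the tunnel complaint is exactly those two speeds being tuned far apart. A rough Python sketch; the function name and constants are illustrative, not from any particular engine:

```python
def adapt(current, target, dt, speed_up=1.0, speed_down=4.0):
    """Ease the exposure value toward the scene's target brightness.

    Darkening (entering a tunnel) is given a much higher speed than
    brightening (stepping into daylight), which produces the
    near-instant "night vision" versus slow painful whiteout.
    """
    speed = speed_down if target < current else speed_up
    # move a fraction of the remaining gap each frame, clamped to 100%
    return current + (target - current) * min(1.0, speed * dt)

# one 0.1 s frame: going dark closes far more of the gap than going bright
print(adapt(1.0, 0.1, 0.1))  # bright scene -> dark tunnel
print(adapt(0.1, 1.0, 0.1))  # dark tunnel -> bright daylight
```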
Eyeballs aren’t as bad as cameras, but they definitely also do that.
You’re still using your eyeballs!
The correct way of implementing chromatic aberration would be like the one on the “corrected” side. There is still some, but it really is subtle.
Anyway, I don’t think games are a good target for chromatic aberration. It’s really meant for photorealistic scenes, mainly photorealistic renders, that give a sort of uncanny valley effect without it.
But once again - it looks stupid if your scene is not photo-realistic in the first place.
Is it supposed to highlight the “mirage-ness” of it all or something?
It is supposed to mimic low-quality cameras. Chromatic aberration occurs because different colors of light focus at slightly different distances from the lens. It’s the same effect that causes prisms to “split” white light into its component colors: the angle light is bent at depends on its wavelength/color. Newer, more expensive cameras have various means of either correcting for or avoiding the problem.
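Games don’t simulate any of that optics, of course; the usual post-process just re-samples the red and blue channels at slightly shifted screen positions so edges pick up color fringes. A toy 1-D sketch in plain Python, with made-up names and a fixed one-pixel shift:

```python
def aberrate_row(row, shift=1):
    """row: list of (r, g, b) tuples. Returns a new row with the red
    channel sampled `shift` pixels to the left and the blue channel
    `shift` pixels to the right, clamping at the edges."""
    n = len(row)
    out = []
    for i in range(n):
        r = row[max(0, i - shift)][0]      # red pulled from the left
        g = row[i][1]                      # green stays put
        b = row[min(n - 1, i + shift)][2]  # blue pulled from the right
        out.append((r, g, b))
    return out

# a white bar on black grows blue/cyan fringes on one side and
# yellow/red fringes on the other -- the telltale color halo
row = [(0, 0, 0)] * 3 + [(255, 255, 255)] * 4 + [(0, 0, 0)] * 3
print(aberrate_row(row))
```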
Putting on my “that guy” hat here…
The quality has nothing to do with it. Even very high end lenses can exhibit chromatic aberration under certain circumstances. Have a look at any sports broadcast. Once you see it, you can’t stop, and the lenses on those cameras are decidedly NOT low quality. Or price. https://www.bhphotovideo.com/c/product/1314025-REG/canon_uj86x9_3b_p01_dss_uhd_digisuper_86_broadcast.html
That said, even low-end lenses from the past decade or so have far less chromatic aberration than top-tier glass from decades back. I have an old Canon telephoto that produces crazy color fringes on anything and everything if I’m not careful, but my new cheapass Lumix zoom only does so in pretty extreme situations.
It’s definitely a good time to be a photography nerd.
It took me way too long to spot the difference.
In games it’s usually way less subtle
Clickbait title doesn’t say what the feature is and says it’s being removed for all players, not just disabled by default.