Simulating human night vision in 3D games

As you know, human vision in the dark is provided by a type of retinal cell called rods, whose high light sensitivity comes at the cost of visual acuity and color perception (although there are more rods than cones in the retina, they are spread over a much larger area, so the overall "resolution" is lower).




First we need to decide what effect we want to achieve. I broke the processing down into the following parts:

Loss of color in areas of low light
Loss of visual acuity in the same areas
Slight noise in the same areas
Loss of visual acuity at a distance in medium and low light


We will write a post-processing shader for Unity3D Pro.
Before proceeding directly to the shader, let's write a small script that runs the screen buffer through this shader every frame:

using UnityEngine;

public class HumanEye : MonoBehaviour
{
    public Shader Shader;
    public float LuminanceThreshold;
    public Texture Noise;
    public float NoiseAmount = 0.5f, NoiseScale = 2;

    private Camera mainCam;
    private Material material;

    private const int PASS_MAIN = 0;

    void Start ()
    {
        mainCam = camera;
        mainCam.depthTextureMode |= DepthTextureMode.DepthNormals;
        material = new Material (Shader);
    }

    void OnRenderImage (RenderTexture source, RenderTexture destination)
    {
        material.SetFloat ("_LuminanceThreshold", LuminanceThreshold);
        material.SetFloat ("_BlurDistance", 0.01f);
        material.SetFloat ("_CamDepth", mainCam.farClipPlane);
        material.SetTexture ("_NoiseTex", Noise);
        material.SetFloat ("_Noise", NoiseAmount);
        material.SetFloat ("_NoiseScale", NoiseScale);
        material.SetVector ("_Randomness", Random.insideUnitSphere);
        Graphics.Blit (source, destination, material, PASS_MAIN);
    }
}

Here we simply pass the user-defined parameters to the shader and re-render the screen buffer through it.
Now declare the shader's variables and constants:

sampler2D _CameraDepthNormalsTexture;
float4 _CameraDepthNormalsTexture_ST;

sampler2D _MainTex;
float4 _MainTex_ST;

sampler2D _NoiseTex;
float4 _NoiseTex_ST;
float4 _Randomness;

uniform float _BlurDistance, _LuminanceThreshold, _CamDepth, _Noise, _NoiseScale;

#define FAR_BLUR_START 40
#define FAR_BLUR_LENGTH 20

The vertex shader is standard and performs no unusual transformations. The fun starts in the pixel shader.

First, we sample the color of the current pixel, along with a "fuzzy" (blurred) value for the same pixel:

struct v2f {
    float4 pos : POSITION;
    float2 uv : TEXCOORD0;
    float2 uv_depth : TEXCOORD1;
};

half4 main_frag (v2f i) : COLOR
{
    half4 cColor = tex2D(_MainTex, i.uv);
    half4 cBlurred = blur(_MainTex, i.uv, _BlurDistance);

The "washed out" value is produced by the blur() function, which samples several pixels in the neighborhood of ours and averages their values:

inline half4 blur (sampler2D tex, float2 uv, float dist) {
    #define BLUR_SAMPLE_COUNT 16
    // generated by a random roll of floats!
    const float3 RAND_SAMPLES[16] = {
        .... 14 more vectors ....
    };

    half4 result = 0;
    for (int s = 0; s < BLUR_SAMPLE_COUNT; ++s)
        result += tex2D(tex, uv + RAND_SAMPLES[s].xy * dist);
    result /= BLUR_SAMPLE_COUNT;
    return result;
}
The "unlit" factor of a pixel is determined from its average brightness across the three channels. The coefficient is cut off at a given brightness threshold (LuminanceThreshold): all pixels brighter than this are considered "bright enough" and are left unprocessed.

half kLum = (cColor.r + cColor.g + cColor.b) / 3;
kLum = 1 - clamp(kLum / _LuminanceThreshold, 0, 1);
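Numerically, those two shader lines define a simple ramp. The following is a CPU-side Python transcription (for illustration only, not part of the shader):

```python
def k_lum(r, g, b, threshold):
    """Darkness coefficient: 1 for black pixels, falling linearly to 0
    as the average brightness reaches the threshold, 0 above it."""
    lum = (r + g + b) / 3
    return 1 - min(max(lum / threshold, 0.0), 1.0)
```

With a threshold of 0.5, a black pixel gets kLum = 1, a mid-gray pixel at brightness 0.25 gets kLum = 0.5, and anything at or above brightness 0.5 gets kLum = 0.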

The dependence of kLum on brightness looks something like this:

kLum values for our scene look like this (white = 1, black = 0):

You can clearly see that bright areas (the lamps and the lit grass) have a kLum of zero, so our effect will not apply to them.

The distance in meters from the camera to the surface at a given screen pixel can be obtained from the depth texture (Z-buffer), which is readily available with deferred rendering.

float depth;
float3 normal;
DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, i.uv_depth), depth, normal);
depth *= _CamDepth; // depth in meters

The kDepth factor determines the degree of blur for dark objects nearby, and kFarBlur for everything far away:

#define DEPTH_BLUR_START 3
half kDepth = clamp(depth - DEPTH_BLUR_START, 0, 1);
half kFarBlur = clamp((depth - FAR_BLUR_START) / FAR_BLUR_LENGTH, 0, 1);
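Both coefficients are clamped linear ramps over distance. A small Python sketch of the same arithmetic (illustrative only; the constants match the shader's #defines):

```python
DEPTH_BLUR_START = 3    # meters: dark-pixel blur ramps in over 1 m from here
FAR_BLUR_START = 40     # meters: distance blur starts here...
FAR_BLUR_LENGTH = 20    # ...and reaches full strength 20 m later

def clamp01(x):
    return min(max(x, 0.0), 1.0)

def k_depth(depth):
    """0 until 3 m, then ramps to 1 over the next meter."""
    return clamp01(depth - DEPTH_BLUR_START)

def k_far_blur(depth):
    """0 until 40 m, then ramps to 1 by 60 m."""
    return clamp01((depth - FAR_BLUR_START) / FAR_BLUR_LENGTH)
```

So at 3.5 m an unlit pixel is already half blurred, while distance blur only becomes noticeable past 40 m.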

Plots of both coefficients against distance look identical and differ only in scale:

kFarBlur values:

Now we calculate the pixel's total blur factor from the three previous coefficients:

half kBlur = clamp(kLum * kDepth + kFarBlur, 0, 1);

Dark pixels are blurred starting from a distance of a few meters (DEPTH_BLUR_START), while distant objects are blurred regardless of lighting.

We will make the degree of color loss equal to the degree of "unlitness" (half kDesaturate = kLum).

Now we mix the normal, blurred, and grayscale pixel values and compute the resulting color:

half kDesaturate = kLum;

half4 result = cColor;
result = (1 - kBlur) * result + kBlur * cBlurred;

half resultValue = (result.r + result.g + result.b) / 3;
result = (1 - kDesaturate) * result + kDesaturate * resultValue;
return result;
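Both mixing steps above are ordinary linear interpolations. A hedged Python sketch of the same color pipeline, operating on a 3-channel RGB list (names are mine, not from the shader):

```python
def lerp(a, b, t):
    """Linear interpolation, matching the shader's (1 - t) * a + t * b."""
    return (1 - t) * a + t * b

def night_color(rgb, blurred_rgb, k_blur, k_desat):
    # Step 1: mix the sharp and blurred pixel by the blur factor.
    mixed = [lerp(c, b, k_blur) for c, b in zip(rgb, blurred_rgb)]
    # Step 2: pull the result toward its grayscale value by the
    # desaturation factor.
    gray = sum(mixed) / 3
    return [lerp(c, gray, k_desat) for c in mixed]
```

With k_blur = 0 and k_desat = 0 the pixel passes through untouched; with k_desat = 1 a pure red pixel collapses to a uniform gray of its average brightness.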

However, if you look at the picture in motion, you can see that something is missing. What? Noise!

half noiseValue = tex2D(_NoiseTex, i.uv * _NoiseScale + _Randomness.xy).r;
half kNoise = kLum * _Noise;

Here we sample a random value from the _NoiseTex texture (filled with Gaussian noise in Photoshop), offset by the _Randomness vector supplied by the script, which changes every frame.
The resulting random value is mixed into our pixel:

result *= (1 - kNoise + noiseValue * kNoise);
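This is multiplicative noise: the factor equals 1 when kNoise is 0 and swings around 1 by the sampled noise value in dark areas. A minimal Python sketch of the formula (illustrative only):

```python
def apply_noise(value, noise_sample, k_lum, noise_amount):
    """Scale a pixel value by (1 - kNoise + noise * kNoise).

    In bright areas (k_lum = 0) the pixel is untouched; in dark areas
    the pixel is multiplied by the noise sample, weighted by kNoise.
    """
    k_noise = k_lum * noise_amount
    return value * (1 - k_noise + noise_sample * k_noise)
```

A noise sample of 1.0 leaves the pixel unchanged even at full strength, while a sample of 0.0 can darken a fully unlit pixel all the way to black.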
