 A simple retro shader for WebGL games
Posted on November 13th, 2017 by Gio
What does it take to convert a high-res game into something with a lovely retro look? Can we just build our game normally and then apply a post effect to retro-ify our game?

Yes we can do that, or we can apply an effect to each sprite individually - the result is slightly different, but this technique works well in both cases.

Here you can see the same scene that I created for my perspective clouds blog post, with this retro effect applied as a post process shader.

The WebGL shader to achieve this is incredibly simple. Essentially, back in the day, we had lower resolutions and a smaller range of possible colors. Both these things are easy to simulate.

The key point is that we have to "quantize" the values that we want to use. Let's start with our texture coordinates (also known as UV coordinates) that we use to sample our source texture.

Note: for simplicity I'm doing this as a post process effect in WADE (applied to a layer), but you could do this on each of your sprites individually if you prefer.

We have a uv vector, where both coordinates are in the range [0, 1). But of course these values do not vary continuously from 0 to 1 - they change in discrete increments, one for each pixel that we have on the screen.

So for example, let's say that we have a 1920x1080 resolution. The X coordinate for the left-most pixel, in UV space, is 0. The X coordinate for the next pixel is 1/1920. The next pixel's coordinate is 2/1920, etc.

To reduce the resolution by a factor of 10 for example, still sampling a 1920x1080 resolution, essentially we want the X coordinate of the first 10 pixels to be the same - 0. Then we want the next group of 10 pixels to use 10/1920 as their X coordinate. The following group of 10 pixels will use 20/1920, and so on.
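This grouping is just a floor-divide. As a sanity check, here is the same arithmetic in plain JavaScript (quantizeU is an illustrative helper, not part of WADE or the shader):

```javascript
// Map a source pixel's X coordinate to the quantized UV it will sample,
// reducing the horizontal resolution by the given factor.
function quantizeU(pixelX, srcWidth, factor) {
  const u = pixelX / srcWidth;       // UV coordinate in [0, 1)
  const res = srcWidth / factor;     // e.g. 1920 / 10 = 192 columns
  return Math.floor(u * res) / res;  // snap to the start of the column
}

// Pixels 0..9 all collapse onto u = 0...
console.log(quantizeU(0, 1920, 10));   // 0
console.log(quantizeU(9, 1920, 10));   // 0
// ...and pixels 10..19 all collapse onto u = 10/1920.
console.log(quantizeU(15, 1920, 10));  // 1/192, i.e. 10/1920
```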

The shader code to do this is straightforward - given a resolution vector (let's call it res and use a shader uniform for this - in this example we would set it to vec2(192., 108.)) we can simply do:

```
vec2 uv = uvAlphaTime.xy;
uv = floor(uv * res) / res;
```

This gives us the low-res effect that we are after. But what about color? What can we do to achieve a retro 16-bit effect?

It turns out that we can do a very similar thing. These days we use 32-bit colors - this means that for each pixel we have one byte (8 bit) for red, one byte for green, one byte for blue and one byte for alpha (transparency). In other words, we have 256 possible color values for each channel, and we have 4 channels.

Back in the day we had 16 bits per pixel, and if you're old enough you'll remember that we had 8 bits before that.

When we had 16 bits per pixel, there was no transparency information encoded in each pixel, i.e. everything was fully opaque. The 16 bits were subdivided like this: 5 bits for red, 6 bits for green and 5 bits for blue. Why 6 bits for green? Well for one thing, because you can't divide 16 evenly by 3. And supposedly, the human eye is more sensitive to green light than it is to red and blue, hence the extra accuracy for the green channel.

So we have 32 possible color values for the red and blue channels, and we have 64 values for the green channel.
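To see what these level counts do to an actual color, here is the quantization step in plain JavaScript (a sketch for checking the numbers; the names are illustrative):

```javascript
// Snap each normalized RGB channel down to its quantization level:
// 32 levels (5 bits) for red and blue, 64 levels (6 bits) for green.
const depth = [32, 64, 32];

function quantizeColor([r, g, b]) {
  return [r, g, b].map((c, i) => Math.floor(c * depth[i]) / depth[i]);
}

// Green keeps more precision than red and blue:
console.log(quantizeColor([0.27, 0.27, 0.27]));
// [0.25, 0.265625, 0.25] - red/blue snap to 8/32, green to 17/64
```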

Similar to what we did before, we need to define a constant vector (we call it depth) and set it to vec3(32., 64., 32.). Then we use the same formula as above to quantize our colors:

```
vec4 color = texture2D(uDiffuseSampler, uv);
color.xyz = floor(color.xyz * depth) / depth;
```

Obviously if you want a more retro effect, you can use lower values for res and depth.

So here's the complete shader (where we are using res as a shader uniform).

```
const vec3 depth = vec3(32., 64., 32.);

vec2 uv = uvAlphaTime.xy;
uv = floor(uv * res) / res;
vec4 color = texture2D(uDiffuseSampler, uv);
color.xyz = floor(color.xyz * depth) / depth;

gl_FragColor = color;
```
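If you want to check the math outside the GPU, the whole pipeline can be mirrored as a CPU-side JavaScript sketch (the sample callback stands in for texture2D; all names here are illustrative, not WADE API):

```javascript
// CPU reference of the shader above, for testing the quantization math.
const RES = [192, 108];      // low-res grid, as in the res uniform
const DEPTH = [32, 64, 32];  // 5/6/5-bit color levels

function retroPixel(uv, sample) {
  // 1. snap the UV to the low-res grid
  const q = uv.map((c, i) => Math.floor(c * RES[i]) / RES[i]);
  // 2. sample the source "texture" at the snapped UV (caller-provided)
  const [r, g, b, a] = sample(q);
  // 3. quantize the color channels, leaving alpha untouched
  return [...[r, g, b].map((c, i) => Math.floor(c * DEPTH[i]) / DEPTH[i]), a];
}

// Example: a "texture" that returns the UV as a color gradient.
const gradient = ([u, v]) => [u, v, 0.5, 1.0];
console.log(retroPixel([0.503, 0.503], gradient)); // [0.5, 0.5, 0.5, 1]
```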