Posted on December 15th, 2017 by Gio
Lights that move and change over time can add a whole new dimension to your 2D games. As it turns out, it's fairly easy to do.

I have previously explored the idea of using 3D lighting on some sprites, in my post about 3D-looking objects in a 2D scene.

In that post I used some maths to essentially draw a sphere (with 3D lighting) on the screen. This involved calculating the normal vector of the spherical surface for each pixel, which is pretty easy for a sphere. But what about doing that for arbitrarily complex objects?

We are going to do just that here. I have created a simple demo scene with three objects (the normal maps were created in a different way for each object), so you can see what it looks like. Move the mouse over the scene below to move the light.

The cat is the one that looks the best, because it was a 3D object to start with. It was then rendered twice: once to generate a 2D sprite as usual, and once to generate a normal map, like this:

There are several guides and tutorials to replicate this process in most 3D authoring tools, for example this one for Blender, or you could do it in 3ds Max using this free plug-in.

What's especially interesting though, is that you can also approximate this process for hand-drawn, purely 2D sprites. The gist of it, if you want to know how it works (but really you don't care because there is software that does this for you automatically), is that you can run some edge-detection algorithm first, and that gives you an idea of where there may be discontinuities in your source image. Based on that and on the brightness of each pixel, you can then infer a height gradient for the pixels near the edges, and therefore calculate approximate normals.
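If you're curious, here is a minimal sketch of that idea in JavaScript. The function name and parameters are made up for illustration, and real tools use smarter kernels (like the Sobel and Prewitt filters mentioned below), but the principle is the same: treat brightness as height, take the gradient, and tilt the normal against it.

```javascript
// Hedged sketch: approximate a normal from a grayscale "height" image.
// `height` is a 2D array of brightness values in [0, 1]; `strength`
// controls how steep the inferred surface is.
function heightToNormal(height, x, y, strength) {
    // Central differences approximate the height gradient at (x, y)
    var dx = (height[y][x + 1] - height[y][x - 1]) * strength;
    var dy = (height[y + 1][x] - height[y - 1][x]) * strength;
    // The normal points out of the image, tilted against the gradient
    var len = Math.sqrt(dx * dx + dy * dy + 1);
    return { x: -dx / len, y: -dy / len, z: 1 / len };
}
```

A flat region yields (0, 0, 1), i.e. a normal pointing straight at the viewer, while pixels near an edge get tilted normals.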

If you google a bit you can probably find a tutorial to do this in your favourite image editor. There is a popular plug-in by Nvidia to do this in Photoshop. Here, however, I'm going to show you how to do it in GIMP, which is cool because it's free.

First of all you'll need to download this normalmap plug-in for GIMP. Inside the zip you'll find a readme.txt file that tells you where to put the other files. If you are on Windows, normalmap.exe goes into C:\Program Files\GIMP 2\lib\gimp\2.0\plug-ins (or wherever you installed GIMP) and the DLL files go into C:\Program Files\GIMP 2\bin.

After doing that, the next time you open GIMP you will have a new filter, aptly called Normalmap.

That pops up a dialog where you can change the edge-detection filter to use, and a few other parameters. In the scene above, I have used Sobel 3x3 on the flowers and Prewitt 5x5 on the tropical fruit. As you can see, a 5x5 filter (and Prewitt in particular) yields an overall sharper result. Which may or may not be a good thing depending on your art style.

Now we know how to generate a normal map, but how do we use one in our 2D games? It's time for some shader magic.

I will be doing this in WADE, which like GIMP, is cool because it's free. Also it's cool because it's awesome. Seriously. But you can do exactly the same thing in any other game engine / framework that supports shaders. The only things that you will need are:

• UV coordinates accessible from your shader code. WADE gives these to you in the x and y components of a vector called uvAlphaTime
• A texture sampler that contains the main (color) texture for your sprite. In WADE this is called uDiffuseSampler
• Another sampler, that we will call normalSampler, that contains your normal texture. In the WADE Sprite editor you'd set it up like this:

While you're there, do make sure that your sprite is set to Always Draw. This is important: WADE normally redraws a sprite only when the sprite itself changes, so without Always Draw the lighting won't update as the light moves.

Now let's start writing some shader code, assuming (temporarily) that the light is coming from a constant direction. This means that we have an infinitely far away light source, for example the sun. We will define a constant vector called l for the light direction.

```
const vec3 l = normalize(vec3(0.5, -0.5, 1.0));
```

Feel free to use any other value for the light direction. However it's important that you always normalize it. Then we proceed to read the color and normal values from our textures:

```
vec4 normalMap = texture2D(normalSampler, uvAlphaTime.xy);
vec4 color = texture2D(uDiffuseSampler, uvAlphaTime.xy);
normalMap.xy *= 2.;
normalMap.xy -= 1.;
```

As you can see we also rescale the x and y components of our normal vector, multiplying by 2 and subtracting 1. This is because, in the red and green channels of the normal map, we have color values ranging from 0 to 1, which, in fact, represent vector components that are meant to range from -1 to 1.
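To make the remap concrete, here is the same decoding step written out in JavaScript (decodeNormal is just a name made up for this example). Note that only x and y are remapped; z is used as-is, matching the shader above.

```javascript
// Hedged illustration of decoding a normal-map texel on the CPU.
// `texel` is {r, g, b}, each channel in [0, 1].
function decodeNormal(texel) {
    return {
        x: texel.r * 2 - 1, // [0, 1] -> [-1, 1]
        y: texel.g * 2 - 1, // [0, 1] -> [-1, 1]
        z: texel.b          // z used directly, as in the shader above
    };
}
```

This is why normal maps look mostly light blue: the "flat, facing the viewer" texel (0.5, 0.5, 1) decodes to the normal (0, 0, 1).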

The next step is to calculate the lighting for each pixel. Like we did for the sphere in my other blog post, we will use a simple NdotL lighting model. That means that we take the dot product of our light vector and our normal vector to calculate the amount of light received by each pixel. This will then modulate the intensity of our color texture (we do this for the RGB channels, we leave alpha alone).

```
float light = dot(normalMap.xyz, l);
vec4 lit = vec4(light * color.xyz, color.w);
gl_FragColor = lit;
```

And that's it - that's all we need for our directional light shader. Neat and super simple.
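If you want to check the numbers, here is a CPU-side sketch of the same NdotL computation for a single pixel (shadePixel is an illustrative name, not part of any engine):

```javascript
// Hedged sketch of the NdotL model for one pixel.
// n: decoded normal, l: normalized light direction, color: RGBA texel.
function shadePixel(n, l, color) {
    // Dot product: 1 when the pixel faces the light head-on, 0 at 90 degrees
    var light = n.x * l.x + n.y * l.y + n.z * l.z;
    // Modulate RGB, leave alpha alone
    return { r: light * color.r, g: light * color.g, b: light * color.b, a: color.a };
}

// A flat pixel (normal facing the viewer) lit from the shader's default
// direction receives about 82% of full intensity:
var len = Math.sqrt(0.5 * 0.5 + 0.5 * 0.5 + 1.0);
var l = { x: 0.5 / len, y: -0.5 / len, z: 1.0 / len };
var lit = shadePixel({ x: 0, y: 0, z: 1 }, l, { r: 1, g: 1, b: 1, a: 1 });
```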

In the demo above, however, I am using a point light: not an infinitely far away light source, but a light that is part of the scene. This is potentially more useful in many game scenarios, and it's only a little bit more work.

The difference from the directional light is that we now have a different light direction for each pixel in our sprite, so we can no longer define l as a constant. Instead, we need to add a new shader uniform of type vec3 that we are going to call lightPosition, representing the light position.

Now we could do this in any coordinate system, as long as we always use the same coordinate system for everything. I find it most natural to do everything in UV space, that is the texture-coordinate space that normally ranges from (0, 0) at the top-left corner of our sprite to (1, 1) at its bottom-right corner. Choosing this coordinate space makes all the calculations very simple. But how do we know what the light position is in this coordinate space?

We need to set this value from outside our shader - this must be calculated in our application, on the CPU. This is what the lightPosition vector is for. Now this will be different in every game engine, but I can tell you what I did in WADE for the demo above.

First of all I created a SceneObject called light that represents our light source. I made it follow the mouse by adding an onMouseMove function to the app (i.e. I added this code to app.js, after line 12):

```
this.onMouseMove = function(data)
{
    _.light.setPosition(data.screenPosition);
};
```

Then for each 3D-lit object in my scene, I used the shader code and the shader uniforms as explained above, and added this onUpdate function:

```
var sprite = this.getSprite();
var size = sprite.getSize();
var pos = this.getPosition();
var lightPos = _.light.getPosition();
// light position relative to the sprite's top-left corner, rescaled to UV space
sprite.lightPosition[0] = (lightPos.x - (pos.x - size.x / 2)) / size.x;
sprite.lightPosition[1] = (lightPos.y - (pos.y - size.y / 2)) / size.y;
sprite.lightPosition[2] = 1;
```

This transforms the light position into UV space, essentially by calculating the difference between the light position and the top-left corner of our sprite (in world space), then dividing by the size of the sprite. I am also, arbitrarily, setting the z component of the light position to 1, meaning that the light is always somewhere on the viewer's side of the screen. You can change that, but always use a positive value - you don't want the light to be on the opposite side of the screen, otherwise everything would appear black.

Finally, we need to modify our shader to use this light position. As it turns out, it's easy-peasy: instead of defining a constant l, we take the difference between the light position and the current pixel's position (its UV coordinates, with z set to 0), and normalize it.

```
vec3 l = normalize(lightPosition - vec3(uvAlphaTime.xy, 0.));
```
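To see why this gives a different light direction per pixel, here's a quick CPU-side sketch of the same calculation (lightDirAt is just an illustrative name; lightPosition and uv are in UV space, as described above):

```javascript
// Hedged sketch of the per-pixel light direction for a point light.
function lightDirAt(lightPosition, uv) {
    var dx = lightPosition.x - uv.x;
    var dy = lightPosition.y - uv.y;
    var dz = lightPosition.z; // the pixel sits at z = 0
    var len = Math.sqrt(dx * dx + dy * dy + dz * dz);
    return { x: dx / len, y: dy / len, z: dz / len };
}

// With the light at (0.5, 0.5, 1), a pixel at the sprite's centre is lit
// head-on, while a corner pixel is lit at an angle:
var centre = lightDirAt({ x: 0.5, y: 0.5, z: 1 }, { x: 0.5, y: 0.5 });
var corner = lightDirAt({ x: 0.5, y: 0.5, z: 1 }, { x: 0, y: 0 });
```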

And that's it, a fantastic, dynamic point light in your 2D scene. I hope you can make something amazing with it. As usual, feel free to download the full WADE project and play with it.