XNA 4.0 Shader Programming #4–Normal Mapping


Welcome back to the XNA Shader Programming series. I hope you enjoyed the last 3 tutorials, and have started to get a grip on shaders!
Last time we talked about Specular lighting, and how to implement this in our own engines. Today I’m going to take this to the next level, and implement Normal Mapping.

Technique: Normal mapping
Normal mapping is a way to make a low-poly object look like a high-poly object without adding more polygons to the model. We can make surfaces, like walls, look a lot more detailed and realistic by using the technique described next.

Figure 4.1a – Scene without Normal mapping on Zombie, with Normal Mapping on the background

Figure 4.1b – Scene with Normal mapping on both Zombie and Background

An easy way to describe normal mapping is that it is used to fake the existence of geometry.

To compute normal mapping, we need two textures: a color map, like a stone texture, and a normal map that describes the direction of the normals across the surface. Instead of calculating the lighting with the vertex normals, we calculate it with the normals stored in the normal map.

Figure 4.2 – How normal mapping works

Figure 4.2 shows a simple example of normal mapping. We have a stone texture and a normal map; a shader combines these and applies them to a perfectly round sphere. The normal map simulates holes and bumps in the stone's surface, making the sphere look less smooth. As you can see, the edges of the sphere give away that the bumps are fake, since normal mapping does not physically displace the vertices of the surface.

Tangent space
Sounds easy, yes? Well, there is one more thing. In most normal mapping techniques (like the one I'm describing today), the normals are stored in what is called the texture-space, or tangent-space, coordinate system. Since the light vector is handled in object or world space, we need to get the light vector and the normals from the normal map into the same space before we can calculate any lighting.

To describe tangent space, take a look at figure 4.3.
Figure 4.3 – Tangent space

We will grab the tangents from the model file and pass them into the shader. The shader then uses the tangent, binormal and normal to build a matrix that lets us bring the lighting calculation and the normals into the same space.
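Before we look at the implementation, here is a minimal sketch of what such a matrix does. It is for illustration only and is not part of the tutorial shader; it assumes the tangent, binormal and normal form an orthonormal basis and are already transformed into world space.

float3 TangentToWorld(float3 tangentVector, float3 T, float3 B, float3 N)
{
	// mul(vector, matrix) treats the vector as a row vector, so the result is
	// tangentVector.x * T + tangentVector.y * B + tangentVector.z * N.
	return mul(tangentVector, float3x3(T, B, N));
}

float3 WorldToTangent(float3 worldVector, float3 T, float3 B, float3 N)
{
	// mul(matrix, vector) instead projects the world-space vector onto each axis.
	return mul(float3x3(T, B, N), worldVector);
}

The vertex shader later in this tutorial builds exactly this kind of matrix, row by row.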

To have XNA generate the tangents and binormals for a model, click the asset in the content project, expand the Content Processor node in its properties, and set the Generate Tangent Frames property to True:

Figure 4.4 – Content Processor

We'll take a closer look at how to implement this later. For now, let's focus on adding an image file that represents the color of each surface: textures! Textures can be used to add color to a wall, skin to an arm, a dragon logo to the chest of a warrior, and so on. A texture can be any image file, such as .jpg, .bmp or .png.

To implement basic normal mapping, we need two textures: one for the color map of our model, and one that contains the normal map for the model.

Textures
To use textures in HLSL, we need to create what are called texture samplers. A texture sampler, as the name suggests, sets the sampler state for a texture: which filtering the texture should use (trilinear in our case), and how the U,V coordinates behave, for example clamping or mirroring the texture.

To create a sampler for our texture, we first need to define a texture variable the sampler will use:

texture2D ColorMap;
sampler2D ColorMapSampler = sampler_state
{
	Texture = <ColorMap>;
	MinFilter = linear;
	MagFilter = linear;
	MipFilter = linear;
};
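
The sampler above only sets the filtering. The addressing behaviour mentioned earlier is configured in the same sampler_state block; here is a sketch of a variation that reuses the ColorMap texture declared above and repeats the texture outside the 0 to 1 range:

sampler2D WrappedColorMapSampler = sampler_state
{
	Texture = <ColorMap>;
	MinFilter = linear;
	MagFilter = linear;
	MipFilter = linear;
	AddressU = Wrap;
	AddressV = Wrap;
};

Replacing Wrap with Clamp stretches the edge pixels instead, and Mirror flips every other repetition.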

So, we have a texture and a sampler; next we need to put them to use.

Since we are using a pixel shader to map a texture onto an object, we can simply create a vector to store the color information. A color has 3 or 4 channels: one for red, one for green, one for blue and, in some cases, one for the alpha value, which represents the transparency of the color.

float4 Color;
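
As a small aside, the individual channels of such a vector can be read and written through the rgba (or xyzw) swizzles. A tiny sketch with a hypothetical helper:

float4 MakeOpaque(float4 color)
{
	color.a = 1.0;	// keep the red, green and blue channels, force full opacity
	return color;
}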

Next, we need to set the Color variable to the color of our texture at texture coordinate UV.

In HLSL, this can easily be done with the intrinsic function tex2D(s, t), where s is the sampler and t is the texture coordinate of the pixel we are currently working on.

float4 color = tex2D(ColorMapSampler, input.TexCoord);

Texture coordinates? Well, let me explain. A texture coordinate is simply a 2D coordinate (U,V) stored at each vertex of our 3D model. It is used to map a texture onto the object and ranges from 0.0 to 1.0.

Figure 4.5 – Texture Coordinates

With texture coordinates, the model can have textures assigned to different places; say, an iris texture on the eyeball of a human model, or a mouth somewhere on a human face.
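
As a sketch of how this range interacts with the sampler: coordinates outside 0.0 to 1.0 are resolved by the sampler's address mode, so with Wrap addressing, scaling the coordinates tiles the texture. The hypothetical helper below reuses the ColorMapSampler defined earlier and repeats the color map four times in each direction.

float4 SampleTiled(float2 texCoord)
{
	return tex2D(ColorMapSampler, texCoord * 4.0);
}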

As for the lighting algorithm, we will use specular lighting just as before, but the normals are fetched from a texture instead of from the vertices.
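
Here is a minimal sketch of that difference, assuming a directional light as in the earlier tutorials. VertexDiffuse uses the interpolated vertex normal, while MappedDiffuse reads the normal from a normal map; these helpers are for illustration only.

float VertexDiffuse(float3 vertexNormal, float3 lightDirection)
{
	return saturate(dot(-lightDirection, normalize(vertexNormal)));
}

float MappedDiffuse(sampler2D normalMapSampler, float2 texCoord, float3 lightDirection)
{
	// Normal maps store each component remapped from the -1..+1 range into 0..1.
	float3 n = normalize(2.0 * tex2D(normalMapSampler, texCoord).rgb - 1.0);
	return saturate(dot(-lightDirection, n));
}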

Implementing the shader

The biggest difference between this shader and the specular lighting shader is that we work with tangent space instead of object space, and that the normals used in the lighting calculation are read from a normal map. So the first addition to the shader from the previous tutorial is to define the textures and the texture samplers:

texture2D ColorMap;
sampler2D ColorMapSampler = sampler_state
{
	Texture = <ColorMap>;
	MinFilter = linear;
	MagFilter = linear;
	MipFilter = linear;
};

texture2D NormalMap;
sampler2D NormalMapSampler = sampler_state
{
	Texture = <NormalMap>;
	MinFilter = linear;
	MagFilter = linear;
	MipFilter = linear;
};

 

Having the textures ready, we must add the texture coordinates to the vertex shader input structure. Here we also take the normal, binormal and tangent (generated by the content processor), ready for use in the vertex shader.

struct VertexShaderInput
{
    float4 Position : POSITION0;
	float2 TexCoord : TEXCOORD0;
	float3 Normal : NORMAL0;
	float3 Binormal : BINORMAL0;
	float3 Tangent : TANGENT0;
};

We will also need the texture coordinate as input to the pixel shader, so we add it to the vertex shader output structure.

struct VertexShaderOutput
{
    float4 Position : POSITION0;
	float2 TexCoord : TEXCOORD0;
	float3 View : TEXCOORD1;
	float3x3 WorldToTangentSpace : TEXCOORD2;
};

The Vertex Shader

Now, in the vertex shader, a lot is still the same as before, but we need to fill in the WorldToTangentSpace matrix so we can get the normals and the lighting calculation into the correct space.

To fill WorldToTangentSpace, we normalize the tangent, binormal and normal from the input and multiply each of them with the World matrix.

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
	VertexShaderOutput output;
	
	float4 worldPosition = mul(input.Position, World);
	float4 viewPosition = mul(worldPosition, View);
	output.Position = mul(viewPosition, Projection);
	output.TexCoord = input.TexCoord;

	output.WorldToTangentSpace[0] = mul(normalize(input.Tangent), World);
	output.WorldToTangentSpace[1] = mul(normalize(input.Binormal), World);
	output.WorldToTangentSpace[2] = mul(normalize(input.Normal), World);
	
	output.View = normalize(float4(EyePosition,1.0) - worldPosition);

	return output;
}

We start by transforming the position as usual.

Then we build a 3×3 matrix, WorldToTangentSpace, whose rows are the tangent, binormal and normal transformed into world space.

So what we get from this vertex shader is the transformed position, the texture coordinate, the view vector and this matrix. As mentioned earlier, the normal map is stored in tangent space, so to calculate proper lighting we need the sampled normal and the light and view vectors in the same space; the pixel shader uses this matrix to achieve exactly that.

So, now that we know how to get our vectors into the right space, we are ready to implement the pixel shader.

The pixel shader needs to get the pixel color from the color map and the normal from the normal map.
Once this is done, we can calculate the ambient, diffuse and specular lighting based on the normal from our normal map.

The code for the pixel shader is also pretty straightforward: we basically have to change the way the normal is obtained and factor the color from the texture into our lighting. Have a look at the code:

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
	float4 color = tex2D(ColorMapSampler, input.TexCoord);

	float3 normalMap = 2.0 *(tex2D(NormalMapSampler, input.TexCoord)) - 1.0;
	normalMap = normalize(mul(normalMap, input.WorldToTangentSpace));
	float4 normal = float4(normalMap,1.0);

	float4 diffuse = saturate(dot(-LightDirection,normal));
	float4 reflect = normalize(2*diffuse*normal-float4(LightDirection,1.0));
	float4 specular = pow(saturate(dot(reflect,input.View)),8);

    return  color * AmbientColor * AmbientIntensity + 
			color * DiffuseIntensity * DiffuseColor * diffuse + 
			color * SpecularColor*specular;
}

We get the color from the texture by using the tex2D(s,t) function. It returns the color located at position t in texture s. We do the same with the normal, but remap it so that it ranges from -1 to +1 instead of 0 to 1.
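
As a concrete example of the remapping, consider the light blue "flat" color most normal maps are dominated by: the texel (0.5, 0.5, 1.0) becomes the unperturbed normal (0, 0, 1). A hypothetical helper, shown only to make the formula explicit:

float3 UnpackNormal(float3 packedNormal)
{
	return 2.0 * packedNormal - 1.0;	// 0..1  ->  -1..+1
}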

The rest is the same as before, except that we multiply the color value into the ambient, diffuse and specular terms.

And that's basically it! A lot of new concepts were introduced today. Play around with the example to make sure you understand what the variables do.
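
Two easy experiments, as a sketch only: a larger exponent in the pow() call gives a smaller, sharper specular highlight, and if the lighting looks inverted along one axis, your normal map may have been authored with a flipped green channel. A hypothetical helper that compensates after unpacking:

float3 FlipGreenChannel(float3 unpackedNormal)
{
	unpackedNormal.g = -unpackedNormal.g;
	return unpackedNormal;
}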

The entire shader is listed below.

// XNA 4.0 Shader Programming #4 - Normal Mapping

// Matrix
float4x4 World;
float4x4 View;
float4x4 Projection;

// Light related
float4 AmbientColor;
float AmbientIntensity;

float3 LightDirection;
float4 DiffuseColor;
float DiffuseIntensity;

float4 SpecularColor;
float3 EyePosition;


texture2D ColorMap;
sampler2D ColorMapSampler = sampler_state
{
	Texture = <ColorMap>;
	MinFilter = linear;
	MagFilter = linear;
	MipFilter = linear;
};

texture2D NormalMap;
sampler2D NormalMapSampler = sampler_state
{
	Texture = <NormalMap>;
	MinFilter = linear;
	MagFilter = linear;
	MipFilter = linear;
};

// The input for the VertexShader
struct VertexShaderInput
{
    float4 Position : POSITION0;
	float2 TexCoord : TEXCOORD0;
	float3 Normal : NORMAL0;
	float3 Binormal : BINORMAL0;
	float3 Tangent : TANGENT0;
};

// The output from the vertex shader, used for later processing
struct VertexShaderOutput
{
    float4 Position : POSITION0;
	float2 TexCoord : TEXCOORD0;
	float3 View : TEXCOORD1;
	float3x3 WorldToTangentSpace : TEXCOORD2;
};

// The VertexShader.
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;

    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
	output.TexCoord = input.TexCoord;

	output.WorldToTangentSpace[0] = mul(normalize(input.Tangent), World);
	output.WorldToTangentSpace[1] = mul(normalize(input.Binormal), World);
	output.WorldToTangentSpace[2] = mul(normalize(input.Normal), World);
	
	output.View = normalize(float4(EyePosition,1.0) - worldPosition);

    return output;
}

// The Pixel Shader
float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
	float4 color = tex2D(ColorMapSampler, input.TexCoord);

	float3 normalMap = 2.0 *(tex2D(NormalMapSampler, input.TexCoord)) - 1.0;
	normalMap = normalize(mul(normalMap, input.WorldToTangentSpace));
	float4 normal = float4(normalMap,1.0);

	float4 diffuse = saturate(dot(-LightDirection,normal));
	float4 reflect = normalize(2*diffuse*normal-float4(LightDirection,1.0));
	float4 specular = pow(saturate(dot(reflect,input.View)),8);

    return  color * AmbientColor * AmbientIntensity + 
			color * DiffuseIntensity * DiffuseColor * diffuse + 
			color * SpecularColor*specular;
}

// Our Technique
technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

Download Source (XNA 4.0)


10 Responses to XNA 4.0 Shader Programming #4–Normal Mapping

  1. xnakiwi says:

    float3 normalMap = 2.0 *(tex2D(NormalMapSampler, input.TexCoord)) - 1.0;

    Why do you multiply by 2 and take away 1?

    • allanrobertson7 says:

      you need to transform the texture color values (which are in the 0 to 1 range) to world space coordinates (which are -1 to 1)

  2. soso says:

    THANK YOUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUU

  3. Pint says:

    Your tutorials are just great. The best XNA hlsl tutorials that I have found. They actually work and the solutions seem pretty elegant. Please do keep on posting also the rest tutorials in XNA 4.0 🙂

  4. Douwe says:

    Finally! It works! Thank you soooooo much!

  5. Taylor says:

    Thank you, works great but I’m getting strange artifacts (some sort of pixeling noise) for textures in distance. :/ Somebody has an idea, why?

  6. Hi
    Thanks a lot for the great tutorials, they worked in my project but I have a big problem. I've made an Xbox 360 game with XNA 4.0 and the problem is that my levels contain about 160 3D objects. Now the shader (Normal Mapping) works great in the avatar selection since there are only 3 objects, but in the level I have very bad performance. How can I achieve a performance of about 30 FPS?

  7. Too Bad says:

    Same problem as above, everything works perfect, but performance drops rapidly with some more objects

  8. nononono2832 says:

    Hello, I'm new to XNA.
    If I don't work with predefined meshes, and I generate them dynamically (generated dungeon),
    how can I obtain the tangents?

  9. Michael says:

    I made my model with Blender and when I ran the code the model was all black. My solution was to load each Texture2D using LoadContent and then SetValue for ColorMap and NormalMap as follows in my case:

    shinyJem = Content.Load<Effect>(@"Effects/ShinyJem");
    normalMap = Content.Load<Texture2D>(@"Models/jemnormal");
    texture = Content.Load<Texture2D>(@"Models/jemtexture");
    shinyJem.Parameters["ColorMap"].SetValue(texture);
    shinyJem.Parameters["NormalMap"].SetValue(normalMap);

    With love, Michael from BB Productions.
