XNA Shader Programming
Tutorial 17, Point light + Self-Shadowing
Hi, and welcome to Tutorial 17 of the XNA Shader Programming tutorial!
Today we are going to build on the Normal Mapping shader we made in tutorial 4. You don’t need to know Normal mapping before making a point light, so if you just want to know how to implement a point light, feel free to continue reading!
This tutorial will only explain what a point light is and how to implement it, so you won't get distracted by the normal map. The algorithm used here is then added to the normal mapping shader, or whatever shader you like (assuming you have learned the technique 😉).
It's not really hard, as long as you understand tutorials 1, 2 and 3. So if you have not done these three, you should do them before jumping into this one.
Source and executable can be found at the bottom of the article!
By the way, the images will be remade once I get my software (other than Paint) working again, sorry about that.
Point light
A point light (also called an omni light in some rendering tools) is a light source where the light spreads out in every direction from a single point in 3D space.
Fig 17.1
Unlike our previous directional lights, the point light can have a range, making its light rays die out after a certain distance. This is called the attenuation factor:
Fig 17.2
If we subtract the dot product of V1 = L/r and V2 = L/r (V1.V2) from 1, where L is the vector from the vertex to the light and r is the light range, we get the attenuation factor. Since dot(L, L) = |L|^2, this is simply 1 - (|L|/r)^2: it is 1 at the light's position and falls to 0 at distance r.
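In HLSL this is the same calculation that appears in the vertex shader later in this tutorial; a minimal sketch (assuming L is the world-space vector from the vertex to the light and LightRange is r):
// L: world-space vector from the vertex to the light, LightRange: the radius r of the light
float attenuation = saturate(1.0f - dot(L / LightRange, L / LightRange));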
We want objects to receive only ambient light when they are outside the range of our point light, so the final light equation for our point light looks like this:
I = A + (Diffuse + Specular) * Attenuation
Self shadowing
One problem we have in our light equation is that we get some artifacts from our diffuse and specular light calculations. Pixels get lit even when the light vector and the view vector are pointing in opposite directions. Also, when the light gets too close to the surface, we get artifacts as well.
A solution to this is to implement something called self-shadowing, which prevents pixels that should not be lit from being lit.
The self-shadowing factor will be zero, or close to zero, when the geometry is occluded or should not be lit, and above zero when the surface/pixel meets the conditions to be lit.
So, how do we do this? Yes, you guessed it: the dot product between the normal of the surface and the light direction:
S = saturate( N.L );
But the falloff of this value is quite gradual, so by multiplying it by 4.0 (and saturating the result) we get a much sharper threshold [Frazier].
Given this, we can now multiply the specular and diffuse light calculations by S, making the diffuse and specular light zero when S is zero!
This gives us a new light equation:
I = A + S * (Diffuse + Specular) * Attenuation
This can be optimized (try this on your own, as an exercise ;)) by only calculating diffuse and specular if S is above zero; otherwise the diffuse and specular components are simply set to zero: if( S > 0 ) { calculate.. }. A sketch of this is shown below.
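A minimal sketch of that optimization, assuming N, LightDir and ViewDir are the normalized normal, light and view vectors (these names and the shininess value are my assumptions, not the tutorial's exact code; Shadow is the self-shadowing factor S from the equation above, while D and S below are the diffuse and specular factors, as in the pixel shader further down):
float Shadow = saturate(4.0 * dot(N, LightDir));
float D = 0.0;   // diffuse factor
float S = 0.0;   // specular factor
if (Shadow > 0.0)
{
    // Only do the lighting math when the surface can actually be lit
    D = saturate(dot(N, LightDir));
    S = pow(saturate(dot(reflect(-LightDir, N), ViewDir)), 16.0);   // assumed shininess of 16
}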
But… you might be wondering about bump mapping? Can't the normal from a bump map still point towards the light, even though the surface (physically) does not? Yes, it can, and we need to prevent this!
When using normal mapping, we are working in tangent space. This allows us to use the z-component of our light vector, because in tangent space the unperturbed surface normal is (0, 0, 1), so the light vector's z-component equals N.L for the underlying geometry.
This leads us to the following formula:
S = 4 * LightDirection.z;
Implementing the shader
First of all, we need to declare a light range:
float LightRange;
Then, in our vertex shader, we can calculate the light, putting the attenuation in the light's w-component and putting the light direction vector in L:
// calculate the vector from the vertex to the light, in world space
float3 L = vecLightPos - PosWorld;
// Transform light to tangent space
Out.Light.xyz = normalize(mul(worldToTangentSpace, L)); // L, light
// Compute the attenuation from the light range and store it in the w-component
Out.Light.w = saturate( 1 - dot(L / LightRange, L / LightRange));
What's left now is to use the attenuation value in our light equation and apply self-shadowing, which is done in the pixel shader:
float Shadow = saturate(4.0 * LightDir.z);
….
return 0.2 * Color + Shadow*((Color * D * LightColor + S*LightColor) * (L.w));
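For context, here is a minimal sketch of how the whole pixel shader could fit together with these pieces in place; the input names, the sampler declarations and the diffuse/specular math are my assumptions, not the tutorial's exact source:
sampler ColorMapSampler;    // assumed: colour/diffuse texture sampler
sampler NormalMapSampler;   // normal map sampler
float4 LightColor;          // light colour set from the application

float4 PS(float2 Tex : TEXCOORD0, float4 Light : TEXCOORD1, float3 View : TEXCOORD2) : COLOR
{
    float4 Color = tex2D(ColorMapSampler, Tex);
    float3 N = normalize(2.0 * tex2D(NormalMapSampler, Tex).rgb - 1.0);   // unpack tangent-space normal
    float3 LightDir = normalize(Light.xyz);
    float3 ViewDir = normalize(View);

    // Self-shadowing: LightDir.z equals N.L for the unperturbed surface in tangent space
    float Shadow = saturate(4.0 * LightDir.z);

    float D = saturate(dot(N, LightDir));                                  // diffuse factor
    float S = pow(saturate(dot(reflect(-LightDir, N), ViewDir)), 16.0);    // specular factor, assumed shininess

    // Ambient + self-shadowed, attenuated diffuse and specular (Light.w holds the attenuation)
    return 0.2 * Color + Shadow * ((Color * D * LightColor + S * LightColor) * Light.w);
}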
As you can see, the point light isn't very different from the directional lights we used before. 🙂 The self-shadowing value can also be used in our previous examples where a light is being used.
Using the shader
Nothing new here, apart from setting the light range and a light position instead of a light direction:
effect.Parameters["vecLightPos"].SetValue(vLightPosition);
effect.Parameters["LightRange"].SetValue(100.0f);
effect.Parameters["LightColor"].SetValue(vLightColor);
effect.Parameters["vecLightPos"].SetValue(vLightPosition);
effect.Parameters["LightRange"].SetValue(100.0f);
effect.Parameters["LightColor"].SetValue(vLightColor);
NOTE:
You might have noticed that I have not used effect.CommitChanges(); in this code. If you are rendering many objects using this shader, you should add this call inside the pass.Begin() part so the changes take effect in the current pass, and not in the next pass. This should be done whenever you set any shader parameters inside the pass.
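As a rough C# sketch of where that call goes when drawing several objects in one pass (the render loop, the SceneObject type and the per-object parameter shown here are assumptions, not the tutorial's actual code):
effect.Begin();
foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Begin();

    foreach (SceneObject obj in objects)                        // hypothetical list of objects to draw
    {
        effect.Parameters["matWorld"].SetValue(obj.World);      // assumed per-object parameter

        // Commit the changed parameters so they affect this pass, not the next one
        effect.CommitChanges();

        obj.Draw();                                             // hypothetical draw call
    }

    pass.End();
}
effect.End();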
Short and simple. If you have any questions or feedback, please leave a comment or send me an e-mail! 🙂
Download: Executable + Source
Comments

This tutorial seems to have the same problems as Tutorial 4: a custom vertex declaration needs to be implemented, which defines which fields of the vertex data correspond to the TANGENT and BINORMAL properties, and then the content processor for the models should be set to generate tangent frames. Without doing this, the TANGENT input to the vertex shader will be uninitialised/undefined. I've even found that the code as-is sometimes gives different results (presumably depending on whatever values are in the GPU registers at the time).

In the pixel shader, I saw that you are calculating the normal with
float3 N = (2 * (tex2D(NormalMapSampler, Tex))) - 0.5;
when surely it should be:
float3 N = (2 * (tex2D(NormalMapSampler, Tex))) - 1.0;

And another point, even though you have a note about it in the tutorial text itself: you need to put effect.CommitChanges() after all the parameter setting for each model, otherwise the parameters being set will not take effect until the next effect.Begin(). On a similar theme, it's better to set all the constant shader parameters (e.g. light position, colour, eye position, etc.) outside the effect.Begin()…End() so they are not written each time round the loop; just a little bit of optimisation which may help in more complex scenes.

Thanks again for a very informative and eye-opening series of tutorials. I'm finding them very helpful, and learning lots along the way.
I spotted another error in this demo: in the vertex shader, you are normalizing PosWorld, but it should of course be left in world coordinates, as it represents a position, not a direction. If you then move the point light a little further to the right (so it's not inside the meteor!), you see a much more convincing point light effect!
Thanks again for the feedback! I will upload a new and correct example.