So, you want to learn the magic that turns 3D models and textures into gold?
This tutorial is the first part of a series where I will cover a lot of different shaders, much like my XNA shader tutorial series. However, this tutorial will be an introduction to shader programming. You will learn the basics of the graphics pipeline, what a shader really is, write your first Unity 5 shader, and learn a very basic lighting equation: ambient light.
2001: A shader odyssey – A brief history of shaders
Shaders have been used in ray tracers and in film rendering for a long time, but the story for real-time rendering is different.
Before DirectX 8 and the OpenGL ARB assembly language, GPUs had a fixed way of transforming vertices and pixels, called the fixed-function pipeline. Developers could not change how vertices and pixels were transformed and processed once they had been passed to the GPU, which made games look quite similar graphics-wise.
In 2001, DirectX 8 added support for vertex and pixel shaders, giving developers a way to decide how vertices and pixels are processed as they move through the pipeline, and with it a lot of flexibility.
Shaders were written in an assembly language, which made it pretty hard to be a shader developer, and Shader Model 1.0 was the only supported version. This changed with the release of DirectX 9, which let developers write shaders in a high-level language called the High Level Shading Language (HLSL), replacing the assembly shading language with something that looked much more like C. This made shaders far easier to write, read and understand. OpenGL got a similar language called the OpenGL Shading Language (GLSL).
DirectX 10 introduced a new shader stage, the Geometry Shader, as part of Shader Model 4.0, and DirectX 11 added tessellation shaders as well as compute shaders for GPGPU.
Taking the red pill
So, the question is: what is a shader? Well, a shader is simply a set of instructions that is executed on the graphics processing unit (GPU), performing whatever specific task you need. This puts the developer in control of all the programmable stages in the graphics pipeline. It also makes you responsible for all of the calculations, and you need to do (almost) everything yourself. On the other hand, it enables you to do anything you want. So, are you ready to take the red pill and slide through the graphics pipeline?
The Graphics Pipeline?
It might not be obvious to developers who aren't familiar with low-level graphics programming, but everything you see on a screen comes from structures of data. Typically, this is a 3D model your artist made in a 3D modeling package, where each entry in the structure holds the vertex position, normal direction, tangents, texture coordinates and color. It can also be a structure built procedurally by an awesome algorithm you wrote, and so on. Even sprites, particles and textures in your game world are usually rendered using vertices.
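To make this concrete, here is a minimal sketch of what one entry in such a vertex structure could look like in Cg/HLSL; the exact layout depends on your model and engine, and the struct name is just an example:

    // Illustrative vertex layout: one entry like this per vertex in the model
    struct VertexData
    {
        float3 position : POSITION;   // where the vertex sits in object space
        float3 normal   : NORMAL;     // direction the surface faces at this vertex
        float4 tangent  : TANGENT;    // used for effects like normal mapping
        float2 uv       : TEXCOORD0;  // texture coordinates
        float4 color    : COLOR;      // per-vertex color
    };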
Image from MSDN: https://msdn.microsoft.com/en-us/library/windows/desktop/ff476882(v=vs.85).aspx
This data is sent into the pipeline through the Input Assembler, processed all the way through it, and ends up as pixels on your monitor. Think of it like your dirty, gray-looking car passing through a car wash: it's a black box that does things to it in different stages, spraying water on it, adding soap, brushing it and drying it, and when you come out the other end your car has color, reflection, refraction and everything; it feels like you got a new car.
All the rounded boxes in the image above are the programmable stages in the graphics pipeline. Understanding shaders and being able to get creative with them is like getting root access to the graphics world.
Vertex Shader Stage
This shader is executed once per vertex and is mostly used to transform the vertex, do per-vertex calculations, or prepare data for use further down the pipeline.
Hull Shader Stage (Only used for tessellation)
Takes the vertices as input control points and converts them into the control points that make up a patch (a fraction of a surface).
Domain Shader Stage (Only used for tessellation)
This stage calculates a vertex position of a point in the patch created by the Hull Shader.
Geometry Shader Stage
A Geometry Shader is an optional program that takes the primitives (a point, line, triangle, … for example) as an input, and can modify, remove or add geometry to it.
Pixel Shader Stage
The Pixel Shader (known as the Fragment Shader in the OpenGL world) is executed once per pixel, giving each pixel its color. It gets its input from earlier stages in the pipeline and is mostly used for calculating surface properties, lighting and post-process effects.
Optimize!
Each of the stages above is usually executed thousands of times per frame and can become a bottleneck in the graphics pipeline. A simple cube made from triangles typically has around 36 vertices (12 triangles, unindexed). This means the Vertex Shader will be executed 36 times every frame, and if you aim for 60 fps, that vertex shader runs 2160 times per second, for a single cube!
You should optimize these as much as you can.
Developing Shaders in Unity 5
So that was a very brief introduction to shaders and the graphics pipeline. You might now understand that something called a Graphics Pipeline exists, and that you can program parts of it to be able to do whatever you want, but I guess you still have a lot of questions.
There are many ways to develop a shader, like writing in HLSL, GLSL or Cg. Unity, however, uses a language named ShaderLab, which is used to define a material. The shader code inside a ShaderLab file is written in Cg/HLSL (Cg and HLSL are very similar), and inline GLSL is also supported. We will take a look at this later, but for now, let's take a closer look at ShaderLab.
ShaderLab
The best way to learn this is to jump directly to Unity. Launch Unity 5 and create a new project.
Cg/HLSL
The High Level Shading Language (HLSL) is used to develop shaders using a language similar to C. Just as in C, HLSL gives you tools like variables, functions, data types and control flow (if/else/for/do/while and so on) in order to build the logic for processing vertices and pixels. Below is a table of some keywords that exist in HLSL. This is not all of them, but some of the most important ones.
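As a small illustration, here is a hedged sketch of how a few of the most common data types are declared; the variable names are just examples:

    // Common HLSL/Cg data types (the names are illustrative)
    float    intensity = 1.0;                // 32-bit scalar
    float2   uv        = float2(0.5, 0.5);   // 2-component vector, e.g. texture coordinates
    float3   normal    = float3(0, 1, 0);    // 3-component vector, e.g. normals and positions
    float4   color     = float4(1, 1, 1, 1); // 4-component vector, e.g. RGBA colors
    half     shade     = 0.5;                // medium-precision scalar
    int      count     = 4;                  // integer
    bool     enabled   = true;               // boolean
    float4x4 modelMatrix;                    // 4x4 matrix, e.g. transformation matrices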
The Cg language also supports a low-precision fixed-point number type, indicated by fixed, fixed4 and so on.
For a complete list (HLSL): https://msdn.microsoft.com/en-us/library/bb509587(v=vs.85).aspx and for Cg: https://en.wikipedia.org/wiki/Cg_(programming_language)
HLSL offers a huge set of built-in functions that can be used to solve complex equations. As we go through this series we will cover many of them, but for now, here is a list with just a handful. It's worth getting to know them in order to create high-performance shaders without reinventing the wheel.
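As a taste, here is a hedged sketch showing a few of the common intrinsics in use; the values are just examples:

    // A few common HLSL/Cg intrinsic functions (illustrative usage)
    float3 n = normalize(float3(1, 2, 3));                   // scale a vector to length 1
    float  d = dot(n, float3(0, 1, 0));                      // dot product of two vectors
    float  c = saturate(d);                                  // clamp a value to the [0, 1] range
    float3 m = lerp(float3(0, 0, 0), float3(1, 1, 1), 0.5);  // linear interpolation
    float  p = pow(c, 2.0);                                  // raise a value to a power
    float  l = length(n);                                    // length of a vector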
For a complete list: https://msdn.microsoft.com/en-us/library/ff471376.aspx
Creating a shader in Unity 5
1) The first thing we need is a 3D model that we can apply the shader to. In this case, we can simply add a Sphere to the scene, so do that now.
2) Next we need the shader itself. A shader can be added to the project by simply clicking
Assets->Create->Shader->Standard Surface Shader
3) Now, create a new folder called Shaders and drag this shader into this folder. Also, give it a proper name:
4) We also need a material that will use our shader, so create a new material as well and give it a proper name.
5) Click the material, and you can see the details in the inspector:
This material is using the Standard shader, as seen at the top of the inspector. All the properties you see are simply properties defined inside the brand new Standard Shader, a great shader that introduces physically based shading to Unity.
6) Anyway, that's cool, but we want to change it. Click the shader dropdown and find the shader we just made:
Once selected, you can see that our inspector changed and we now have a lot less properties to choose from:
7) Now, drag this material onto the sphere in our scene and change some of the properties, and you can see that this changes the look of our sphere. It doesn't change the shader itself, just the input to it. The shader then uses this input in some magic formula that produces what you see:
Before we inspect the code, let’s learn how a ShaderLab shader is structured.
You can give the shader a category and a name. The category is used to place the shader in the shader dropdown, and the name is used to identify it.
Next, each shader can have many properties. These can be plain numbers and floats, color data or textures. ShaderLab has a way of defining these so they look good and user-friendly in the Unity inspector.
Next we can have one or more SubShaders. A modern shader requires modern hardware, but we would still like our game to run on older hardware as well. Each SubShader can contain a different implementation of the shader, supporting different hardware.
Inside each SubShader there needs to be at least one Pass, as a shader can be executed in multiple passes. Each pass renders the geometry once and then moves on to the next pass, so try to keep the number of passes to a minimum for performance reasons. A lot of shaders only need one pass.
Your shader implementation lives inside the pass, surrounded by CGPROGRAM and ENDCG (or GLSLPROGRAM and ENDGLSL if you want to use GLSL).
Finally we have the FallBack. If none of the SubShaders can run, we can fall back to another, simpler shader like the Diffuse shader. The skeleton below shows how all of these pieces fit together.
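This is only a rough outline, with placeholder names, not the actual tutorial shader:

    Shader "Category/ShaderName"    // category and name shown in the shader dropdown
    {
        Properties
        {
            // properties exposed in the Unity inspector go here
        }
        SubShader
        {
            Pass
            {
                CGPROGRAM
                // your Cg/HLSL shader code goes here
                ENDCG
            }
        }
        // additional SubShaders targeting other hardware could go here
        FallBack "Diffuse"          // used if no SubShader can run
    }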
8) Now, open your shader and delete all the code so you are left with an empty shader file. We do this because the generated shader is pretty advanced for this stage, and we would like to learn the basics and do everything ourselves.
Implementing your first shader: Ambient Light
By now you should have a general understanding of a shader. It’s simply a piece of code executed somewhere in the graphics pipeline!
The first shader you will write is a really simple one that just transforms the vertices and calculates the ambient light on the model.
But wait… What is this “Ambient light” thing we are talking about?
Well, ambient light is the base light in a scene that's "just there". In a completely dark room the ambient light is black, but whether you are walking around indoors at night or standing outside, there is almost always some light that makes it possible to see. This light has no direction and can be seen as the color of any surface that isn't hit by any direct light; the base light of every object in your game world.
Before we can implement the ambient light shader, we need to understand it. The formula for Ambient light can be seen in 1.1 below.
I = Aintensity x Acolor    (1.1)
I is the final light color on a given pixel, Aintensity is the intensity of the light (usually between 0.0 (0%) and 1.0 (100%)), and Acolor is the color of the ambient light. This color can be a hardcoded value, a parameter or a texture.
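For example, with an intensity of 0.5 and an ambient color of (1.0, 0.8, 0.6, 1.0), the final color becomes I = 0.5 x (1.0, 0.8, 0.6, 1.0) = (0.5, 0.4, 0.3, 0.5), a dimmed, warm tone applied evenly to every pixel of the model.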
The shader can be seen in the following code snippet.
Shader "UnityShaderTutorial/Tutorial1AmbientLight" { Properties { _AmbientLightColor ("Ambient Light Color", Color) = (1,1,1,1) _AmbientLighIntensity("Ambient Light Intensity", Range(0.0, 1.0)) = 1.0 } SubShader { Pass { CGPROGRAM #pragma target 2.0 #pragma vertex vertexShader #pragma fragment fragmentShader fixed4 _AmbientLightColor; float _AmbientLighIntensity; float4 vertexShader(float4 v:POSITION) : SV_POSITION { return mul(UNITY_MATRIX_MVP, v); } fixed4 fragmentShader() : SV_Target { return _AmbientLightColor * _AmbientLighIntensity; } ENDCG } } }
Let’s dive in to the details of this shader.
Properties and Variables
The first thing we do is set the name of the shader; this can be anything you want. Then we define some properties.
As mentioned earlier, ShaderLab has a special way of defining properties. The general formula is to first type the name of the property, then a display name that will be shown in the Unity Editor, then a property type, and finally a default value.
In our shader, we define two properties, one for the Ambient Color and one for the intensity.
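Taking the first property from our shader as an example, the pieces map out like this:

    _AmbientLightColor ("Ambient Light Color", Color) = (1,1,1,1)
    // name:          _AmbientLightColor (what we reference from the Cg code)
    // display name:  "Ambient Light Color" (shown in the inspector)
    // type:          Color
    // default value: (1,1,1,1), i.e. opaque white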
Next we go to the shader itself. Since this is a simple shader that will run on most hardware, we set the target to 2.0.
Then we define the name of the function that will be used as the vertex shader. In our case this is the function vertexShader. We do the same for our fragment shader (pixel shader).
We also declare the variables that the properties map to; their names must be exactly the same as the property names.
The light color is a vector with four values (RGB and A), while the intensity is a float.
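For reference, these are the two declarations in our shader that mirror the properties above:

    fixed4 _AmbientLightColor;       // matches the Color property
    float  _AmbientLightIntensity;   // matches the Range property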
This gives us what we need to implement our vertex and pixel shader.
The Vertex Shader
The Vertex Shader does one thing only: a matrix calculation. The function takes a single input, the vertex position, and has a single output, the transformed position of the vertex (SV_POSITION), which determines where on the screen the vertex ends up, stored in the return value of the function. This value is obtained by multiplying the vertex position (currently in local space) with the combined Model, View and Projection matrices, easily obtained through Unity's built-in state variable UNITY_MATRIX_MVP.
This is done to position the vertices at the correct place on your monitor, based on where the camera is (view) and the projection.
SV_POSITION is a semantic. Semantics are used to pass data between the different shader stages in the programmable pipeline, and SV_POSITION in particular is interpreted by the rasterizer stage. Think of it as one of many registers on the GPU you can store values in. This semantic can hold a vector value (XYZW), and since the value is stored in SV_POSITION, the GPU knows that the intended use for this data is positioning.
In a later tutorial, we will look at Vertex Shaders that take multiple values as input and passes multiple values down the pipeline.
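As a small preview, and only as a hedged sketch (nothing here is needed for the ambient shader), such a vertex shader typically uses structs with semantics, roughly like this:

    // Illustrative only: input and output structs carrying multiple values
    struct appdata
    {
        float4 vertex : POSITION;      // object-space vertex position
        float2 uv     : TEXCOORD0;     // texture coordinates from the model
    };

    struct v2f
    {
        float4 position : SV_POSITION; // position for the rasterizer
        float2 uv       : TEXCOORD0;   // passed on to the pixel shader
    };

    v2f vertexShader(appdata v)
    {
        v2f o;
        o.position = mul(UNITY_MATRIX_MVP, v.vertex); // same transform as before
        o.uv = v.uv;                                  // just forward the UVs
        return o;
    }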
The Pixel Shader
This is where all the coloring happens and where our algorithm is implemented. The function doesn't need any input, as we won't do any advanced lighting calculations yet (we will learn that in the next tutorial). The output is the RGBA value of our pixel color, written to SV_Target (the render target, our final output).
As you can see, this function takes the ambient light color and multiplies it by the intensity. The output will be something like this:
NOTE: A built-in Unity shader variable exists for reading the ambient light from your scene's lighting settings.
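If you want to try that, a minimal sketch could look like the snippet below; it assumes Unity's built-in UNITY_LIGHTMODEL_AMBIENT variable, which is not used anywhere else in this tutorial:

    // Sketch: use the scene's ambient color instead of our own color property
    fixed4 fragmentShader() : SV_Target
    {
        return UNITY_LIGHTMODEL_AMBIENT * _AmbientLightIntensity;
    }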
It looks flat for now as this light doesn’t have any direction to it. However, in the next tutorial, we will build on this to implement diffuse light so stay tuned!
The source can be downloaded from GitHub:
https://github.com/petriw/UnityShaderProgramming/tree/master/1 – Ambient Light