Normal Mapping Effect Sample
This sample builds on the concepts illustrated in the Custom Model Effect sample, and applies a normal mapping effect to a mesh. The sample also uses a custom content processor to demonstrate two additional concepts: adding and removing per-vertex data from a mesh, and reading opaque data.
Sample Overview
The sample demonstrates how to use a custom effect to render a model with a normal map, creating the appearance of a bumpy surface without needing to render additional geometry.
A custom content pipeline processor applies the normal mapping effect to the model during the content build process. The processor also creates additional per-vertex data channels for the binormal and tangent information, which the normal mapping effect uses. Finally, the processor uses opaque data to determine which normal map to use, and then applies it to the material.
To make rendering more efficient at run time, a second custom processor changes the normal map pixel format from an encoded unsigned format to a signed format.
Sample Controls
This sample uses the following keyboard and gamepad controls.
Action | Keyboard control | Gamepad control |
---|---|---|
Rotate the model | UP ARROW, DOWN ARROW, LEFT ARROW, and RIGHT ARROW or W, S, A, and D | Right thumb stick |
Zoom | Z and X | Triggers |
Pause or resume light animation | SPACEBAR | A |
Reset | R | Right thumb stick press |
Exit | ESC or ALT+F4 | BACK |
How the Sample Works
The NormalMappingModelProcessor derives from the built-in ModelProcessor, and overrides the ConvertMaterial method to pass all the materials on the model through to the custom NormalMappingMaterialProcessor.
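The override can be sketched as follows. This is a minimal illustration assuming the standard ContentProcessorContext.Convert API, not the sample's exact source:

```csharp
// Sketch: forward every material on the model through the custom
// material processor instead of the default MaterialProcessor.
[ContentProcessor(DisplayName = "NormalMappingModelProcessor")]
public class NormalMappingModelProcessor : ModelProcessor
{
    protected override MaterialContent ConvertMaterial(
        MaterialContent material, ContentProcessorContext context)
    {
        // Convert routes the material through the named processor.
        return context.Convert<MaterialContent, MaterialContent>(
            material, "NormalMappingMaterialProcessor");
    }
}
```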
Additionally, the processor performs preprocessing on the vertex channels in the scene. The processor overrides the GenerateTangentFrames property, forcing it to always return true. This causes the base ModelProcessor to generate tangent and binormal information for a mesh. The normal mapping technique depends on the presence of this data. The processor also overrides ProcessVertexChannel to remove vertex channels that the normal mapping vertex shader does not use. This optimization makes the run-time vertex buffers smaller and makes them draw more quickly.
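The channel-stripping override might look like the following sketch. The whitelist of channel names is an assumption; the shader in the sample may read a slightly different set:

```csharp
// Sketch: keep only the vertex channels the normal mapping shader reads,
// and drop everything else to shrink the run-time vertex buffers.
static readonly string[] acceptableVertexChannelNames =
{
    VertexChannelNames.TextureCoordinate(0),
    VertexChannelNames.Normal(0),
    VertexChannelNames.Binormal(0),
    VertexChannelNames.Tangent(0),
};

protected override void ProcessVertexChannel(GeometryContent geometry,
    int vertexChannelIndex, ContentProcessorContext context)
{
    string channelName = geometry.Vertices.Channels[vertexChannelIndex].Name;

    if (Array.IndexOf(acceptableVertexChannelNames, channelName) >= 0)
    {
        // A channel the shader uses: let the base class process it normally.
        base.ProcessVertexChannel(geometry, vertexChannelIndex, context);
    }
    else
    {
        // A channel the shader ignores: remove it from the vertex data.
        geometry.Vertices.Channels.Remove(channelName);
    }
}
```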
The last functionality that the processor provides is to determine the normal map texture to use in the effect. The value could be hard-coded in the processor—as NormalMap.tga, for example—but this would not be a very flexible approach. The processor would work only in extremely simple cases. Instead, the processor uses opaque data to determine the path to the texture. Opaque data, also known as blind data and custom attributes, is data that can be applied to a mesh in a digital content creation (DCC) tool. 3ds Max, Maya, and the FBX file format support opaque data. The FBX importer that ships with XNA Game Studio reads this data and puts it into an OpaqueDataDictionary, which is a map from names to objects.
For this sample, the artist created the lizard and rock in Maya. Before exporting to FBX, a custom attribute was applied to all meshes in the scene. The custom attribute is named NormalMap, and its value is a string containing the path to the texture, such as lizard_norm.tga, or rock_norm.tga. The processor looks for this attribute in every mesh's OpaqueData property, and adds the texture file that is found to the collection of textures to be processed. This approach is preferable to hard coding the file name in the processor, because it works on any number of meshes in a scene, and it gives control to the artist.
To accomplish this, the function LookUpNormalMapAndAddToTextures is used to recurse through the scene hierarchy. Whenever a MeshContent object is found, the processor looks in that mesh's opaque data to find the normal map.
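A sketch of that recursion is shown below. The "NormalMap" attribute key matches the one described above; resolving the texture path relative to the mesh's source file is an assumption about how the sample handles relative paths:

```csharp
// Sketch: walk the scene graph, and for each MeshContent, pull the normal
// map path out of the opaque data and attach it to the mesh's materials.
void LookUpNormalMapAndAddToTextures(NodeContent node)
{
    MeshContent mesh = node as MeshContent;
    if (mesh != null)
    {
        // GetValue returns the supplied default if the key is missing.
        string normalMapPath =
            mesh.OpaqueData.GetValue<string>("NormalMap", null);

        if (normalMapPath != null)
        {
            // Resolve the path relative to the source asset's location.
            normalMapPath = Path.Combine(
                Path.GetDirectoryName(mesh.Identity.SourceFilename),
                normalMapPath);

            foreach (GeometryContent geometry in mesh.Geometry)
            {
                geometry.Material.Textures.Add("NormalMap",
                    new ExternalReference<TextureContent>(normalMapPath));
            }
        }
    }

    foreach (NodeContent child in node.Children)
    {
        LookUpNormalMapAndAddToTextures(child);
    }
}
```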
The processor also exposes a parameter that the user can set in the UI as an "override" for the opaque data. If a value is specified in the property grid for NormalMapTexture, that value is used as the normal map for all meshes, overriding whatever is specified in the opaque data. This is useful with content-creation tools that do not support opaque data; however, it forces every mesh in the scene to share one normal map.
The NormalMappingMaterialProcessor, which is used by the NormalMappingModelProcessor, is much simpler. It extends from the built-in MaterialProcessor, and overrides MaterialProcessor.BuildTexture to control the processor that is used to build textures. Normal maps are sent through the custom NormalMapTextureProcessor. All other textures are built as normal.
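The routing decision can be sketched as follows; the comparison against the "NormalMap" texture key is an assumption consistent with the attribute name used elsewhere in the sample:

```csharp
// Sketch: send only normal maps through the custom texture processor;
// build all other textures with the default pipeline behavior.
[ContentProcessor]
[DesignTimeVisible(false)]
public class NormalMappingMaterialProcessor : MaterialProcessor
{
    protected override ExternalReference<TextureContent> BuildTexture(
        string textureName,
        ExternalReference<TextureContent> texture,
        ContentProcessorContext context)
    {
        if (textureName == "NormalMap")
        {
            // Normal maps need the signed-format conversion.
            return context.BuildAsset<TextureContent, TextureContent>(
                texture, "NormalMapTextureProcessor");
        }

        // Everything else gets the standard texture build.
        return base.BuildTexture(textureName, texture, context);
    }
}
```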
Finally, the NormalMapTextureProcessor is used to process normal map textures. It takes input textures that contain encoded normals with unsigned RGB values ranging from 0 to 1. The textures are converted to the Graphics.PackedVector.NormalizedByte4 format, a signed format designed for use with normal maps.
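The conversion itself is a one-line call on the texture content; a minimal sketch, assuming no additional processing such as mipmap generation:

```csharp
// Sketch: re-encode each bitmap in the texture from unsigned [0, 1]
// values into the signed NormalizedByte4 format.
[ContentProcessor]
[DesignTimeVisible(false)]
public class NormalMapTextureProcessor
    : ContentProcessor<TextureContent, TextureContent>
{
    public override TextureContent Process(TextureContent input,
        ContentProcessorContext context)
    {
        // ConvertBitmapType rewrites every face and mip level in place.
        input.ConvertBitmapType(typeof(PixelBitmapContent<NormalizedByte4>));
        return input;
    }
}
```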
Note that the textures and effects are not added to the Game Studio project file. These will be built automatically because they are referenced by the NormalMappingModelProcessor, so there is no need for them to be duplicated in the project itself.
Converting to XNA Game Studio 2.0
The most extensive changes to this sample are in the content processors. Let us cover the quick, cosmetic changes first. In this version of the sample, we have given the NormalMappingModelProcessor's ContentProcessorAttribute a DisplayName, so the processor appears with a friendlier name in the property grid.
Next, in XNA Game Studio 2.0, processors can be hidden from the UI by using the DesignTimeVisibleAttribute. The NormalMappingMaterialProcessor and NormalMapTextureProcessor are ideal candidates for invisibility because it does not make sense to use these processors on their own; they are only helper processors used by the NormalMappingModelProcessor.
The final changes revolve around the XNA Game Studio 2.0 processor parameters feature. In XNA Game Studio 2.0, content processors can expose parameters that change the way they build content. Using those parameters, you can tweak how each piece of content is built. For example, ModelProcessor exposes a scale parameter that can be used to rescale a model without editing the source file itself. This ability is not limited to built-in processors; user-defined processors can expose parameters as well.
In the first version of this sample, the file name of the normal map was taken from the opaque data on the model. This is a flexible and powerful solution, but not all content creation tools support opaque data. Unfortunately, that meant users of those tools could not create content compatible with this sample.
Now that we have processor parameters, however, the story is a little different! By giving our NormalMappingModelProcessor a parameter of type string, we can give the users of our processor a way to specify the normal map without using opaque data. This is not the ideal solution, because it forces all meshes in the scene to share the same normal map, but for some users it may be good enough.
Defining a parameter on a processor is easy: simply create a property on the processor with a public getter and setter. We have defined one on NormalMappingModelProcessor called "NormalMapTexture." Additionally, we can use attributes to feed some more information about our parameter to the UI. Specifically, we use System.ComponentModel.DisplayNameAttribute to control the name of the property in the UI, System.ComponentModel.DescriptionAttribute to control the description, and System.ComponentModel.DefaultValueAttribute to control the default value. Note that the default value specified in this attribute is only a hint to the UI, and is not used when building the content.
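Put together, the parameter might look like this sketch; the display name and description strings are illustrative, not copied from the sample:

```csharp
// Sketch: a processor parameter with UI metadata. The attributes come
// from System.ComponentModel; the DefaultValue is only a hint to the UI.
[DisplayName("Normal Map Texture")]
[Description("If set, this file is used as the normal map on all meshes, " +
             "overriding anything found in the opaque data.")]
[DefaultValue("")]
public string NormalMapTexture
{
    get { return normalMapTexture; }
    set { normalMapTexture = value; }
}
string normalMapTexture;
```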
Once the parameter has been defined, it can be used in the property grid. When the asset is built and the processor's Process method is called, the property's setter will be invoked and will be set to the correct value.
While we are working with processor parameters, let us discuss another change in the NormalMappingModelProcessor. This processor inherits from ModelProcessor, which has been updated in XNA Game Studio 2.0 to have several parameters, including the scale parameter mentioned earlier. One of these in particular is interesting to us: GenerateTangentFrames. Tangent frames are a necessary piece of data to implement normal mapping. In the original version of this sample, we generated this data using MeshHelper.CalculateTangentFrames. Now, the base ModelProcessor can do this for us! To tell it to do so, we will override its GenerateTangentFrames property. Because this data is required for normal mapping and should not be optional, we will have the getter always return true and the setter do nothing. Also, we will use System.ComponentModel.BrowsableAttribute to specify to the UI that this property should not be displayed.
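The override described above is short enough to sketch in full:

```csharp
// Sketch: tangent frames are mandatory for normal mapping, so the
// getter always returns true and the setter discards any value.
// Browsable(false) keeps the property out of the property grid.
[Browsable(false)]
public override bool GenerateTangentFrames
{
    get { return true; }
    set { /* intentionally ignored: tangent frames are always generated */ }
}
```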
Because the base ModelProcessor is generating tangent frames for us, we no longer need to do so in PreprocessSceneHierarchy. Removing CalculateTangentFrames from PreprocessSceneHierarchy leaves it looking rather bare: its only job is to recursively walk the scene calling LookUpNormalMapAndAddToTextures. Instead, we make LookUpNormalMapAndAddToTextures itself a recursive function and remove PreprocessSceneHierarchy entirely. In the process, we update LookUpNormalMapAndAddToTextures to use the user's value for NormalMapTexture, if one is specified.