Loading a Static .X Mesh
Downloads:

- CU_MDX_StaticMesh.zip (115.4 KiB)
- CUnitFramework.zip (101.1 KiB)

Extract the C-Unit Framework inside the project (not solution) directory.

Hard-coding all of your vertex data will get you nowhere fast. 3D modeling programs can usually export geometry to many different file formats; these formats hold different types of information and need to be parsed differently. The format we will use is the .x format, DirectX's native 3D mesh file format.

There are two ways to load .x files. If you don't want or need any animation, you can load the file the easy way, which is what we'll do in this tutorial. The other method saves all of the animation and bone-hierarchy data; it's more complicated, and we'll go over it in a later tutorial. The file we will load is a simple model I made in Maya and then exported to the .x format using the Maya .x exporter that ships with the DirectX SDK. You can view .x files individually using the DirectX Viewer utility that comes with the SDK. I have also created my own Maya .x exporter called cvXporter, which you can download here.

DirectX Viewer

To use meshes, we'll create two classes: Mesh and MeshInstance. Mesh, not to be confused with the Mesh class in the Direct3D namespace, takes care of loading and storing a single mesh from an .x file. MeshInstance is in charge of rendering the mesh through a reference to a Mesh object. The reason for two separate classes is to save memory: loading the same mesh for every instance would be pretty expensive, but with two classes we can load the mesh once and simply have all the instances point to that mesh. This will also come in handy with animated meshes.
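To make the sharing pattern concrete, here is a quick sketch of how the two classes are meant to be used together. The file name `tiger.x` and the idea of setting a per-instance transform through the WorldTransform base class are illustrative assumptions, not code from this tutorial:

```csharp
// Hypothetical usage sketch: load the mesh once, share it among instances.
Mesh tigerMesh = new Mesh( device, "tiger.x" );

// Each MeshInstance references the same vertex data...
MeshInstance tiger1 = new MeshInstance( tigerMesh );
MeshInstance tiger2 = new MeshInstance( tigerMesh );

// ...but carries its own world transform (via the WorldTransform base class),
// so the two tigers can be placed independently in the scene.
```

The vertex, material, and texture data live in `tigerMesh` exactly once, no matter how many instances reference it.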

namespace CUnit
{
    /// <summary>Mesh class</summary>
    public class Mesh : IDisposable
    {
        D3D.Material[] m_materials;
        D3D.Texture[] m_textures;
        D3D.Mesh m_mesh;

        /// <summary>Creates a new static mesh</summary>
        /// <param name="device">Direct3D Device</param>
        /// <param name="file">File name</param>
        public Mesh( D3D.Device device, string file )
        {
            file = Utility.GetMediaFile( file );
            GraphicsBuffer outputAdjacency = new GraphicsBuffer();
            D3D.MaterialList materials = new D3D.MaterialList();
            D3D.EffectInstanceList effects = new D3D.EffectInstanceList();
            m_mesh = new D3D.Mesh( device, file, D3D.MeshFlags.Managed, outputAdjacency, materials, effects );

            // Not using effects
            effects.Dispose();

            // Add normals if the mesh doesn't have any. The December SDK Maya
            // exporter doesn't output any VertexFormat anyway, so add position
            // and texture coordinates as well.
            if ( ( m_mesh.VertexFormat & D3D.VertexFormats.Normal ) != D3D.VertexFormats.Normal )
            {
                D3D.Mesh tempMesh = m_mesh.Clone( device, m_mesh.Options.Value, m_mesh.VertexFormat | D3D.VertexFormats.PositionNormal | D3D.VertexFormats.Texture1 );
                tempMesh.ComputeNormals();
                m_mesh.Dispose();
                m_mesh = tempMesh;
            }

            // Attribute-sort the mesh to enhance Mesh.DrawSubset performance.
            m_mesh.GenerateAdjacency( 0.001f, outputAdjacency );
            m_mesh.OptimizeInPlace( D3D.MeshFlags.OptimizeAttributeSort, outputAdjacency );
            outputAdjacency.Dispose();

            // Extract the material properties and texture names.
            m_textures  = new D3D.Texture[m_mesh.AttributeCount];
            m_materials = new D3D.Material[m_mesh.AttributeCount];
            for ( int i = 0; i < m_mesh.AttributeCount; i++ )
            {
                m_materials[i] = materials[i].Material;

                // Set the ambient color for the material. Direct3D
                // does not do this by default.
                m_materials[i].AmbientColor = m_materials[i].DiffuseColor;

                // Create the texture.
                if ( materials[i].TextureFileName != null && materials[i].TextureFileName.Length > 0 )
                {
                    string texture = System.IO.Path.GetFileName( materials[i].TextureFileName );
                    texture = Utility.GetMediaFile( texture );
                    m_textures[i] = new D3D.Texture( device, texture );
                }
                else
                {
                    m_textures[i] = null;
                }
            }
        }

        /// <summary>Clean up resources</summary>
        public void Dispose()
        {
            if ( m_mesh != null )
            {
                m_mesh.Dispose();
                m_mesh = null;
            }
            m_materials = null;
            if ( m_textures != null )
            {
                for ( int i = 0; i < m_textures.Length; i++ )
                {
                    if ( m_textures[i] != null )
                    {
                        m_textures[i].Dispose();
                        m_textures[i] = null;
                    }
                }
                m_textures = null;
            }
        }

        /// <summary>Gets the source Microsoft.DirectX.Direct3D.Mesh</summary>
        public D3D.Mesh SourceMesh
        {
            get { return m_mesh; }
        }

        /// <summary>Gets the Mesh's Materials.</summary>
        public D3D.Material[] Materials
        {
            get { return m_materials; }
        }

        /// <summary>Gets the Mesh's Textures.</summary>
        public D3D.Texture[] Textures
        {
            get { return m_textures; }
        }
    }
}

The Mesh class is used to load vertex data from a .x file. It has its own Direct3D.Mesh member variable, which is part of the DirectX namespace (a bit confusing, I know). To load the data from the .x file, we simply call the Direct3D.Mesh constructor, which loads all the vertex data into our Direct3D.Mesh instance.

Different exporters export different amounts of mesh information. The December SDK Maya exporter, for example, does not export any VertexFormat data, which tells DirectX what data each vertex holds. To add this data, we need to call BaseMesh.Clone. Also, if the mesh does not have any normals, DirectX can calculate them for us with BaseMesh.ComputeNormals.

Once we're sure the mesh has normals, we attribute-sort the mesh with Mesh.OptimizeInPlace. Each polygon in a D3D.Mesh is assigned an attribute ID, which determines what attribute (material/texture combination) to use when rendering that polygon. All polygons with the same attribute ID are said to be in the same subset. This information is used when rendering a mesh to minimize the number of render-state changes. Polygons of the same subset, however, may be scattered throughout the Mesh's IndexBuffer, which means that whenever we render the Mesh, DirectX has to iterate through the entire IndexBuffer to make sure it renders all the polygons in a given subset. When we attribute-sort the mesh, all polygons of the same subset are stored consecutively, so DirectX only has to iterate through a portion of the IndexBuffer, which increases performance a little.

Besides the vertex data, we also want the model's material and texture data. This data is returned through the MaterialList parameter of the D3D.Mesh constructor. For now, you can ignore the other parameters of the D3D.Mesh constructor.

The MaterialList class contains the mesh's material definitions and texture file names. As a result, we need to iterate through the list both to copy the material definitions into our own Material array and to load the mesh's textures. Notice that the AmbientColor material property is set to the same value as the DiffuseColor property; we need to do this because the AmbientColor property has no value by default.

A small thing to notice is that the sizes of the Texture and Material arrays are set to the number of attributes in the D3D.Mesh. Remember, an attribute is simply a unique texture/material combination. The lengths of these arrays are set to the attribute count rather than materials.Count because in earlier SDKs, the DirectX .x Maya exporter output an extra material: the default material that Maya assigns to any polygons missing material assignments. This would cause an error if the arrays were sized to materials.Count, so to prevent it we use the attribute count, which is the actual number of materials/textures. This appears to be fixed in the December SDK, but I'll leave it in just in case.

namespace CUnit
{
    /// <summary>Instance of a mesh.</summary>
    public class MeshInstance : WorldTransform
    {
        public Mesh m_mesh = null;
        private D3D.BoundingSphere m_boundingSphere;

        /// <summary>Creates a new MeshInstance</summary>
        /// <param name="mesh">Mesh to reference</param>
        public MeshInstance( Mesh mesh )
        {
            m_mesh = mesh;

            // Compute bounding sphere
            using ( D3D.VertexBuffer buffer = m_mesh.SourceMesh.VertexBuffer )
            {
                GraphicsBuffer graphicsBuffer = buffer.Lock( 0, 0, D3D.LockFlags.None );
                m_boundingSphere = D3D.Geometry.ComputeBoundingSphere( graphicsBuffer, m_mesh.SourceMesh.NumberVertices, m_mesh.SourceMesh.VertexFormat );
                buffer.Unlock();
            }
        }

        /// <summary>Render the mesh.</summary>
        /// <param name="device">Direct3D device</param>
        public void Render( D3D.Device device )
        {
            if ( device == null || m_mesh == null )
            {
                return;
            }
            device.Transform.World = Transform;
            device.VertexFormat = SourceMesh.VertexFormat;
            for ( int i = 0; i < m_mesh.Materials.Length; i++ )
            {
                // Set the material and texture for this subset.
                device.Material = m_mesh.Materials[i];
                device.SetTexture( 0, m_mesh.Textures[i] );

                // Draw the mesh subset.
                SourceMesh.DrawSubset( i );
            }
        }

        /// <summary>Clean up resources.</summary>
        public void Dispose()
        {
            // m_mesh is disposed in CUnit.Mesh
            m_mesh = null;
        }

        /// <summary>Gets the referenced Microsoft.DirectX.Direct3D.Mesh</summary>
        public D3D.Mesh SourceMesh
        {
            get { return m_mesh.SourceMesh; }
        }

        /// <summary>Gets and sets the bounding radius of the mesh.</summary>
        public float Radius
        {
            get { return m_boundingSphere.Radius; }
            set { m_boundingSphere.Radius = Math.Abs( value ); }
        }
    }
}

The MeshInstance class is in charge of rendering and transforming a mesh. When we instantiate a MeshInstance object, we pass in a CUnit.Mesh instance. When the MeshInstance.Render method is called, the method sets the Material and Texture to those stored in the CUnit.Mesh instance before rendering a mesh subset. Remember, a mesh subset is a group of polygons that share a common material and texture. Meshes are grouped this way so rendering them is more efficient. To render all the subsets of a mesh we need to loop through each subset, set the corresponding material and texture, and call BaseMesh.DrawSubset.
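Putting it all together, a frame might look something like the sketch below. The device setup, the `Color.Black` clear color, and the instance name `tiger` are illustrative assumptions; only `MeshInstance.Render` comes from the code above:

```csharp
// Hypothetical per-frame rendering sketch, assuming a valid Direct3D device
// and a MeshInstance ("tiger") created from a loaded CUnit.Mesh.
device.Clear( D3D.ClearFlags.Target | D3D.ClearFlags.ZBuffer, Color.Black, 1.0f, 0 );
device.BeginScene();

// Render sets the world transform, then loops over the subsets,
// setting each material/texture pair before calling DrawSubset.
tiger.Render( device );

device.EndScene();
device.Present();
```

Because the mesh was attribute-sorted at load time, each `DrawSubset` call inside `Render` walks only the consecutive run of polygons belonging to that subset.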

You can ignore the bounding sphere code for now, as we’ll go over it later. But you can probably just guess what it does 🙂

Also note that the MeshInstance class inherits from the WorldTransform class that we wrote in the Transformation tutorial. This allows us to translate, rotate, and scale each individual mesh instance.