Applying texture coordinates to a mesh object

halobear333

I've recently come over from OpenGL programming and have a fair amount of 3D programming experience, but:

How the devil do you apply texture coordinates to a mesh object (i.e. a box, sphere, teapot, etc.)? I assume it's similar to how you generate normals: clone the mesh to an FVF that has normals and call ComputeNormals().

But nowhere do I see a ComputeTextureCoords :) Surely you don't have to do this manually? Kind of defeats the purpose of having these premade objects.

I should point out that I have .NET 2002, so I haven't been able to install the documentation, and I can't find the info in that behemoth, msdn.com.
 
Texture coordinates are stored in .X files; DirectX .X files carry them as part of the mesh data. If you use Mesh.Box(...), the generated mesh includes no texture coordinates, but you can create a VertexBuffer and build a cube manually with the texture coordinates filled in.
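
For the .X file route, a minimal loading sketch, assuming an .X file exported with texture coordinates (the file name textured.x is hypothetical; device is your Device):

C#:
ExtendedMaterial[] materials;

// Load the mesh; the .X file supplies positions, normals and texture coordinates.
Mesh mesh = Mesh.FromFile("textured.x", MeshFlags.Managed, device, out materials);

// Each ExtendedMaterial carries the texture file name recorded in the .X file.
Texture[] textures = new Texture[materials.Length];
for (int i = 0; i < materials.Length; i++)
{
	if (materials[i].TextureFilename != null)
		textures[i] = TextureLoader.FromFile(device, materials[i].TextureFilename);
}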

The documentation is installed when you put the SDK on; it makes no difference that you use 2002, the documentation is still installed. Not that this helps much: Managed DirectX 9 is the most poorly documented SDK ever to come from Microsoft. We look forward to better documents coming out soon, or at least in the next version of DirectX.
 
Thanks for the answer.

There's no way of generating texture coordinates for them? What a gyp! For simple geometry I was leaning away from using .X files, since I like being able to customize the number of slices and so on, but ah well.

Actually, around a year ago when I played a bit with Managed DirectX 9, the documentation installed fine. It hasn't since the 2003 release, though, and googling shows that many, many other 2002 users have the same problem. I have a nice .chm help file for C++, but the managed help (HxS format) is inaccessible within the program. I'm currently trying to find a program that will read the HxS format.
 
To make a simple textured flat square in Visual Basic .NET:
Visual Basic:
        Dim Sq As VertexBuffer
        Sq = New VertexBuffer(GetType(CustomVertex.PositionTextured), 4, Dev, 0, CustomVertex.PositionTextured.Format, Pool.Default)

        ' Lock the buffer and fill it; the four vertices are ordered for a triangle strip.
        Dim V() As CustomVertex.PositionTextured = DirectCast(Sq.Lock(0, LockFlags.None), CustomVertex.PositionTextured())

        V(0) = New CustomVertex.PositionTextured(-0.5F, -0.5F, 0.0F, 0.0F, 1.0F) 'Bottom left
        V(1) = New CustomVertex.PositionTextured(-0.5F, 0.5F, 0.0F, 0.0F, 0.0F) 'Top left
        V(2) = New CustomVertex.PositionTextured(0.5F, -0.5F, 0.0F, 1.0F, 1.0F) 'Bottom right
        V(3) = New CustomVertex.PositionTextured(0.5F, 0.5F, 0.0F, 1.0F, 0.0F) 'Top right

        Sq.Unlock()
u = texture X coordinate (0.0 to 1.0, as a fraction of the texture's width)
v = texture Y coordinate (0.0 to 1.0, as a fraction of the texture's height)
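
To actually draw that square as a two-triangle strip, a minimal sketch in C# (assuming dev is the Device, sq is the buffer above, and tex is a Texture loaded elsewhere):

C#:
// Bind the texture and the quad's vertex buffer.
dev.SetTexture(0, tex);
dev.SetStreamSource(0, sq, 0);
dev.VertexFormat = CustomVertex.PositionTextured.Format;

// Four vertices in strip order make two triangles.
dev.DrawPrimitives(PrimitiveType.TriangleStrip, 0, 2);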
 
Thanks, though I already know how to use vertex buffers. :)

I'm still mystified as to why the box, sphere, etc. mesh objects don't have their own texture coordinates, but I guess that's a question only MS can answer. Doesn't make sense to me.

Guess I'll be generating my geometry the hard way (manually).

Again, thanks for the help
 
Mesh vertex information

Here is an example of how to get vertex information from a box mesh in C#.

Code:
// Create a box mesh (not loaded from a file).
mesh = Mesh.Box( dev, 10.0f, 5.0f, 15.0f );

// The declaration describes the vertex layout: one entry per vertex element,
// not one per vertex, so it must not be indexed with the vertex count.
Microsoft.DirectX.Direct3D.VertexElement[] meshDeclaration = mesh.Declaration;
int vertexCount = mesh.NumberVertices;

// To read the per-vertex data itself, lock the vertex buffer.
// Mesh.Box produces position + normal vertices.
CustomVertex.PositionNormal[] verts = (CustomVertex.PositionNormal[])
	mesh.VertexBuffer.Lock( 0, typeof(CustomVertex.PositionNormal), LockFlags.ReadOnly, vertexCount );

for( int i=0; i<vertexCount; i++ ) {
	Vector3 position = verts[i].Position;
	Vector3 normal = verts[i].Normal;
}

mesh.VertexBuffer.Unlock();

I will do a full example at some stage.
 
Texture on a self-made mesh

As I use relatively complex objects, which I convert from a different file format, I am wondering whether this method also works when I want to assign my own vertex buffer to a Mesh. I think it has something to do with attribute ranges, but I have no clue as to what.

Can anyone help me?
 
Example of converting the Mesh.Box to a textured mesh.

halobear333 said:
How the devil do you apply texture coordinates to a mesh object (i.e. a box, sphere, teapot, etc.)? I assume it's similar to how you generate normals: clone the mesh to an FVF that has normals and call ComputeNormals().

But nowhere do I see a ComputeTextureCoords :) Surely you don't have to do this manually? Kind of defeats the purpose of having these premade objects.

This function shows an example:

Code:
/// <summary>
/// Creates a textured box from the Mesh.Box method. Texture coordinates are derived from the
/// x and y positions, shifted by half the width/height and normalized to the 0-1 range;
/// the texture y-coordinate is inverted.
/// </summary>
/// <param name="device">The device used for rendering.</param>
/// <param name="width">Width of the box.</param>
/// <param name="height">Height of the box.</param>
/// <param name="depth">Depth of the box.</param>
/// <returns>A box mesh with texture coordinates.</returns>
public static Direct3D.Mesh TexturedBox(Direct3D.Device device, float width, float height, float depth)
{
	GraphicsStream adjacency;
	Mesh box = Mesh.Box(device,width,height,depth,out adjacency);
	Mesh texturedBox = new Mesh(box.NumberFaces,box.NumberVertices,MeshFlags.Managed,CustomVertex.PositionNormalTextured.Format,device);

	// Get the original box's vertex buffer.
	int [] ranks = new int[1];
	ranks[0] = box.NumberVertices;
	System.Array arr = box.VertexBuffer.Lock(0,typeof(CustomVertex.PositionNormal),LockFlags.None,ranks);

	// Set the vertex buffer
	using(VertexBuffer vb = texturedBox.VertexBuffer)
	{
		System.Array data = vb.Lock(0,typeof(CustomVertex.PositionNormalTextured),LockFlags.None,ranks);

		for(int i=0;i<arr.Length;i++)
		{
			Direct3D.CustomVertex.PositionNormal pn = (CustomVertex.PositionNormal)arr.GetValue(i);
			Direct3D.CustomVertex.PositionNormalTextured pnt = (CustomVertex.PositionNormalTextured)data.GetValue(i);
			pnt.X = pn.X;
			pnt.Y = pn.Y;
			pnt.Z = pn.Z;
			pnt.Nx = pn.Nx;
			pnt.Ny = pn.Ny;
			pnt.Nz = pn.Nz;
			// Planar mapping: shift by half the box size and normalize to the 0..1 range.
			pnt.Tu = (pnt.X + width/2)/width;
			pnt.Tv = 1.0f - (pnt.Y + height/2)/height;
			data.SetValue(pnt,i);
		}

		vb.Unlock();
		box.VertexBuffer.Unlock();
	}

	// Copy the index buffer.
	ranks[0] = box.NumberFaces * 3;
	arr = box.LockIndexBuffer(typeof(short),LockFlags.ReadOnly,ranks);
	texturedBox.IndexBuffer.SetData(arr,0,LockFlags.None);
	box.UnlockIndexBuffer();
	box.Dispose();

	return texturedBox;
}
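
A hypothetical usage, assuming device is your Device and texture a Texture loaded elsewhere:

C#:
// Build a textured unit box and draw it.
Mesh box = TexturedBox(device, 1.0f, 1.0f, 1.0f);
device.SetTexture(0, texture);
box.DrawSubset(0);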

I guess you could do similar things with the other basic Mesh figures. The key here is how to calculate the texture coordinates. For a cylinder you could let Tv be a function of the Z coordinate, while Tu describes where the point lies on the circle surrounding the cylinder. Spheres could also be dealt with in terms of cos and sin functions.
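
For instance, a sketch of that cylindrical mapping for a single vertex, assuming the cylinder's axis runs along Z (as with Mesh.Cylinder) and length is its total length:

C#:
// The angle around the Z-axis gives Tu: 0..1 once around the circumference.
float angle = (float)Math.Atan2(pnt.Y, pnt.X);
pnt.Tu = (angle + (float)Math.PI) / (2.0f*(float)Math.PI);

// The position along the axis gives Tv: 0..1 from one end cap to the other.
pnt.Tv = pnt.Z/length + 0.5f;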
 
halobear333 said:
I'm still mystified as to why the box, sphere, etc. mesh objects don't have their own texture coordinates, but I guess that's a question only MS can answer. Doesn't make sense to me.

Because there is no right way to texture something. If they implemented cylindrical mapping, then people would complain that they wanted spherical.
 
DrunkenHyena said:
Because there is no right way to texture something. If they implemented cylindrical mapping, then people would complain they they wanted spherical.

Yes, I see your point. Since my last post in this thread I have indeed done a spherical texture mapping in the same fashion. It worked... almost. All looked fine except the rendering between the last and first "slice separator". There the rendering interpolated across the whole texture, but reversed. I should have expected such a result, but my lack of experience in this field clouded my mind, so here I am, doing a new trial with my own hand-crafted logic.

The key, I guess, is to add an extra "slice separator" for the last, 360th degree that completely overlaps the first "slice separator", but with texture coordinate Tu = 1.0f instead of 0.0f.

The textured sphere must be generated from the ground up.
I will post the result here when I'm finished.
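
The seam idea as a standalone sketch (the slice count is arbitrary; only the Tu assignment is shown):

C#:
int slices = 8;
for (int slice = 0; slice <= slices; slice++)  // note <=: one extra column of vertices
{
	// The first (0 degree) and last (360 degree) columns get identical positions...
	double angle = (double)slice/slices*2.0*Math.PI;
	// ...but different texture coordinates: Tu is 0.0 at the first column, 1.0 at the last.
	float tu = (float)slice/slices;
}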
 
Here is an example of how to create a textured sphere:

(See the attached bitmap.)

Code:
/// <summary>
/// Creates a PositionNormalTextured sphere.
/// </summary>
/// <param name="device">The current Direct3D drawing device.</param>
/// <param name="radius">The sphere's radius.</param>
/// <param name="slices">Number of slices (horizontal resolution).</param>
/// <param name="stacks">Number of stacks (vertical resolution).</param>
/// <returns>The textured sphere mesh.</returns>
/// <remarks>
/// Number of vertices in the sphere will be (slices+1)*(stacks+1)<br/>
/// Number of faces   : slices*stacks*2<br/>
/// Number of indices : number of faces * 3
/// </remarks>
public static Direct3D.Mesh TexturedSphere(Device device, float radius, int slices, int stacks)
{
	int numVertices = (slices+1)*(stacks+1);
	int numFaces	= slices*stacks*2;
	int indexCount	= numFaces * 3;

	Mesh mesh = new Mesh(numFaces,numVertices,MeshFlags.Managed,CustomVertex.PositionNormalTextured.Format,device);

	// Lock the new mesh's (blank) vertex buffer.
	int [] ranks = new int[1];
	ranks[0] = mesh.NumberVertices;
	System.Array arr = mesh.VertexBuffer.Lock(0,typeof(CustomVertex.PositionNormalTextured),LockFlags.None,ranks);

	// Fill the vertex buffer.
	int vertIndex=0;
	for(int slice=0;slice<=slices;slice++)
	{
		float alphaY = (float)slice/slices*(float)Math.PI*2.0f;  // Angle around the Y-axis
		for(int stack=0;stack<=stacks;stack++)
		{
			Direct3D.CustomVertex.PositionNormalTextured pnt = new CustomVertex.PositionNormalTextured();
			float alphaZ = ((float)stack/stacks-0.5f)*(float)Math.PI;  // Angle around the Z-axis, -PI/2 to PI/2.
			pnt.X = (float)(Math.Cos(alphaY)*radius/2.0f)*(float)Math.Cos(alphaZ);
			pnt.Z = (float)(Math.Sin(alphaY)*radius/2.0f)*(float)Math.Cos(alphaZ);
			pnt.Y = (float)(Math.Sin(alphaZ)*radius/2.0f);
			pnt.Tu = pnt.X/radius;
			pnt.Tv = 0.5f - pnt.Y/radius;
			arr.SetValue(pnt,vertIndex++);
		}
	}

	mesh.VertexBuffer.Unlock();
	// Fill the index buffer: two triangles per quad, stitching each slice to the next.
	ranks[0]=indexCount;
	arr = mesh.LockIndexBuffer(typeof(short),LockFlags.None,ranks);
	int i=0;
	short leftVertex = 0;
	short rightVertex = 0;
	for(short x=0;x<slices;x++)
	{
		leftVertex = (short)((stacks+1)*x);
		rightVertex = (short)(leftVertex + stacks + 1);
		for(int y=0;y<stacks;y++)
		{
			arr.SetValue(rightVertex,i++);
			arr.SetValue(leftVertex,i++);
			arr.SetValue((short)(leftVertex+1),i++);
			arr.SetValue(rightVertex,i++);
			arr.SetValue((short)(leftVertex+1),i++);
			arr.SetValue((short)(rightVertex+1),i++);
			leftVertex++;
			rightVertex++;
		}
	}
	mesh.UnlockIndexBuffer();
	mesh.IndexBuffer.SetData(arr,0,LockFlags.None);
	mesh.ComputeNormals();

	return mesh;
}
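
A hypothetical usage, assuming device is your Device and earthTexture a Texture loaded elsewhere:

C#:
// Build the sphere once, then render it each frame.
Mesh sphere = TexturedSphere(device, 2.0f, 32, 16);
device.SetTexture(0, earthTexture);
device.SamplerState[0].AddressU = TextureAddress.Wrap;
sphere.DrawSubset(0);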
 


You could do what the guy above did :P.

To solve this problem (I was a n00b at the time), I simply made the mesh in parts. For example, if I did a skybox I'd create 6 planes instead of 1 cube.

-The Pentium Guy
 
This is a small change to Reidar Lange's excellent code that works well for texture maps of the Earth and for pool balls.

C#:
static public Mesh CreateSphere( Device device, float radius, int slices, int stacks ) {
	int numVertices = (slices+1)*(stacks+1);
	int numFaces	= slices*stacks*2;
	int indexCount	= numFaces*3;

	Mesh mesh = new Mesh(numFaces, numVertices, MeshFlags.Managed, 
		CustomVertex.PositionNormalTextured.Format, device);

	// Get the blank vertex buffer
	int [] ranks = new int[1] { numVertices };
	CustomVertex.PositionNormalTextured[] v = (CustomVertex.PositionNormalTextured[]) mesh.VertexBuffer.Lock(0,
		typeof(CustomVertex.PositionNormalTextured), LockFlags.None, ranks);

	// Set the sphere vertex buffer
	int vertIndex=0;
	for (int slice=0; slice<=slices; slice++) {
		float alphaY = (float)slice/slices*(float)Math.PI*2.0f;  // Angle around Y-axis
		for (int stack=0; stack<=stacks; stack++) {
			if (slice == slices) {
				// Final slice: reuse the first slice's position and normal so the
				// seam overlaps exactly, but give it Tu = 1.0 (set below) instead of 0.0.
				v[vertIndex] = v[stack];
			}
			else {
				CustomVertex.PositionNormalTextured pnt = new CustomVertex.PositionNormalTextured();
				float alphaZ = ((float)(stack-stacks*0.5f)/stacks)*(float)Math.PI*1.0f;  // Angle around Z-axis.
				pnt.X = (float)(Math.Cos(alphaY)*radius)*(float)Math.Cos(alphaZ);
				pnt.Z = (float)(Math.Sin(alphaY)*radius)*(float)Math.Cos(alphaZ);
				pnt.Y = (float)(Math.Sin(alphaZ)*radius);
				pnt.Nx = pnt.X/radius;
				pnt.Ny = pnt.Y/radius;
				pnt.Nz = pnt.Z/radius;
				pnt.Tv = 0.5f-(float)(Math.Asin(pnt.Y/radius)/Math.PI);
				v.SetValue(pnt, vertIndex);
			}
			v[vertIndex++].Tu = (float)slice/slices;
		}
	}
	mesh.VertexBuffer.Unlock();

	// Set the sphere index buffer
	ranks[0]=indexCount;
	Array arr = mesh.LockIndexBuffer(typeof(short),LockFlags.None,ranks);
	int i=0;
	short leftVertex = 0;
	short rightVertex = 0;
	for(short x=0;x<slices;x++) {
		leftVertex = (short)((stacks+1)*x);
		rightVertex = (short)(leftVertex + stacks + 1);
		for(int y=0;y<stacks;y++) {
			arr.SetValue(rightVertex,i++);
			arr.SetValue(leftVertex,i++);
			arr.SetValue((short)(leftVertex+1),i++);
			arr.SetValue(rightVertex,i++);
			arr.SetValue((short)(leftVertex+1),i++);
			arr.SetValue((short)(rightVertex+1),i++);
			leftVertex++;
			rightVertex++;
		}
	}
	mesh.UnlockIndexBuffer();
	mesh.IndexBuffer.SetData(arr,0,LockFlags.None);

	return mesh;
}

This does a different mapping of the texture coordinates that looked better for the application I'm working on. I also calculate the normals myself instead of calling ComputeNormals, which produced a banding effect when I tried it, probably due to the extra slice of vertices on the final slice.
Hope you find this useful.
 