Tile-Based texture problems

snarfblam

Since DirectDraw has been deprecated, I am learning how to use Direct3D to make a 2D tile-based game, using the book ".NET Game Programming with DirectX 9.0" (VB .NET Edition).

The problem I have is that a row of pixels from the textures seems to wrap around at the right and bottom edges. This only happens when I use a hardware device, not a reference device.
[Attached screenshot: temp.JPG]


The textures are 32x32, and the squares they are rendered into are also 32x32 pixels.

This happens both with my own code and with the source included on the CD that came with the book. I tried using different overloads to load the textures. I tried changing the texture coordinates from (0 to 1) to (0 to .999), which helped somewhat, but I am worried that both the problem and that solution may vary depending on hardware. I googled, scoured the DirectX documentation, and looked through the object browser for any relevant classes and properties, and found nothing that helped.

Here is the code I am using:
Visual Basic:
'To load textures
    Images(i) = TextureLoader.FromFile(Device, _
        IO.Path.Combine(ImagePath, ImageList(i)))
 
'To create polygons
    Const Size As Integer = 32
    'Vertex = (X, Y, Z, tU, tV)
    Vertices(0) = New CustomVertex(X * Size, Y * Size, 1, 0, 1)
    Vertices(1) = New CustomVertex(X * Size + Size, Y * Size, 1, 1, 1)
    Vertices(2) = New CustomVertex(X * Size, Y * Size + Size, 1, 0, 0)
    Vertices(3) = New CustomVertex(X * Size + Size, Y * Size + Size, 1, 1, 0)
 
'To render polygons
    Direct3DDevice.SetTexture(0, SpriteImage)
    Direct3DDevice.SetStreamSource(0, VertBuffer, 0)
    Direct3DDevice.DrawPrimitives(PrimitiveType.TriangleStrip, 0, 2)
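
For reference, the (0 to .999) change I mentioned above looked something like this (illustrative, not my exact code):
Visual Basic:
'Clamp the texture coordinates just short of the far edge so the sampler
'can't wrap around to the opposite side of the texture. Another variant
'I've seen suggested is insetting by half a texel (0.5 / 32 for a 32x32 texture).
Const TexMax As Single = 0.999F
Vertices(0) = New CustomVertex(X * Size, Y * Size, 1, 0, TexMax)
Vertices(1) = New CustomVertex(X * Size + Size, Y * Size, 1, TexMax, TexMax)
Vertices(2) = New CustomVertex(X * Size, Y * Size + Size, 1, 0, 0)
Vertices(3) = New CustomVertex(X * Size + Size, Y * Size + Size, 1, TexMax, 0)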

Can anyone explain to me why this is occurring or how to prevent it? I've spent hours trying to figure this out. My head hurts.
 
Can't say for sure. Generally, if the reference device doesn't match the hardware device, then your hardware (or its driver) is misbehaving. Is your driver up to date?
Also, have you tried using Sprite objects instead?
 
I'm also building a tile-based program, which allows users to create their own RPG games with tile sizes from 32x32 to 256x256, and I don't have this problem. I see you're using primitives; instead of drawing primitives, I am just drawing sprites. They are very easy to use: you just tell a sprite to draw at given x, y coordinates. I don't know if you use polygons for any particular reason, but why don't you use sprites instead?
Although I am a noob at DirectX, I believe that sprites are the best choice for 2D drawing...

As you can see, sprites work well...
[Attached screenshot: RPG.jpg]

Anyway, here is a small example to make your life easier :) (in case you don't know how to use them).
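
This is roughly what my drawing loop looks like (simplified from memory, so treat it as a sketch; TileSprite, TileTexture, Map, MapWidth, MapHeight, and TileSize are placeholder names):
Visual Basic:
'Rough sketch of drawing a tile map with the Direct3D Sprite class.
'Assumes Microsoft.DirectX, Microsoft.DirectX.Direct3D and System.Drawing
'are imported; the texture/map names below are placeholders.
Dim TileSprite As New Sprite(Device)

TileSprite.Begin(SpriteFlags.None)
For Y As Integer = 0 To MapHeight - 1
    For X As Integer = 0 To MapWidth - 1
        'Draw the whole tile texture with its top-left corner
        'at (X * TileSize, Y * TileSize) in screen pixels.
        TileSprite.Draw(TileTexture(Map(X, Y)), _
            New Rectangle(0, 0, TileSize, TileSize), _
            New Vector3(0, 0, 0), _
            New Vector3(X * TileSize, Y * TileSize, 0), _
            Color.White.ToArgb())
    Next
Next
TileSprite.End()

No messing with texture coordinates at all - the sprite works out the quad for you.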
 
I am using primitive polygons because that is what the book I have uses.

BTW, I don't recommend this book. Only after purchasing it did I discover that it is full of syntax errors, wrong class names (the author uses DirectX 8 class names from time to time, for example), and a myriad of other code-related problems. The only reason the book is even usable is that it comes with a CD of working code. After reading enough of it, I really have to question the author's professionalism, and I've seen nothing but bad reviews online.

I am going to try my program on other computers that have different hardware and see what happens, but I have never seen another program do this on my computer. If it were a hardware issue, I would expect to have observed this behavior before.

The tiles are actually stored in an array in the GameEngine class. (It is a fairly simple game and doesn't need a Map class. If and when I apply what I am learning about Direct3D to something that needs a flexible or expandable solution, of course I will use a Map class.) The GameEngine's Draw() method renders the array of tiles.

Questions regarding the Sprite class: is the performance of sprites the same as, or close to, that of polygon-based tiles? Are they similarly easy to use (and if not, does anyone know of a tutorial or have any tips)?

And another question:
Suppose I load two textures like this:
Visual Basic:
Images(0) = TextureLoader.FromFile(Device, "Image1.bmp")
Images(1) = TextureLoader.FromFile(Device, "Image1.bmp")
Will two separate textures actually be loaded (as opposed to the same Texture instance being returned the second time)? Anything else would be counter-intuitive, but in the book I am using, TextureLoader.FromFile is called to load a texture for each individual tile, with only a dozen textures and over a thousand tiles. I modified the code, adding a texture-managing class that preloads the textures and calls TextureLoader.FromFile only once per texture, and the program seemed to start up faster.
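
The texture-managing class is just a thin cache. A simplified sketch of the idea (the class and member names here are made up, not my exact code):
Visual Basic:
'Simplified sketch of the texture manager: load each file once and hand
'back the same Texture instance on repeated requests.
Public Class TextureCache
    Private mDevice As Device
    Private mTextures As New Hashtable()

    Public Sub New(ByVal D3DDevice As Device)
        mDevice = D3DDevice
    End Sub

    Public Function GetTexture(ByVal FileName As String) As Texture
        If Not mTextures.ContainsKey(FileName) Then
            mTextures(FileName) = TextureLoader.FromFile(mDevice, FileName)
        End If
        Return DirectCast(mTextures(FileName), Texture)
    End Function
End Class

Every tile that uses "Image1.bmp" then gets the same Texture object back from GetTexture.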
 
IIRC the code will create two instances of the texture rather than reuse the same object - you are going to be better off with your custom texture manager if you are using the same texture several times. The other problem with loading the same texture multiple times is that it will need to be transferred to the video card's RAM multiple times (and video RAM is usually much smaller than system RAM).
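An easy way to verify is to compare the two references directly:
Visual Basic:
'If this prints False, FromFile returned two separate Texture objects.
Console.WriteLine(Object.ReferenceEquals(Images(0), Images(1)))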
From what I have read, I get the feeling a sprite isn't much more than a convenient wrapper around a textured quad anyway - performance-wise there should be very little difference, if any. They do, however, have a fairly easy interface. I think The Pentium Guy has done a tutorial on them (either here or on his web site).
 
I figured out what my issue is. I feel pretty stupid for this one. I had set anti-aliasing to always on in my hardware settings. I turned it off and my game looks all spiffy now. Did I mention that I feel pretty stupid?

Thanks for everyone's advice, though.
 
marble_eater said:
I figured out what my issue is. I feel pretty stupid for this one. I had set anti-aliasing to always on in my hardware settings. I turned it off and my game looks all spiffy now. Did I mention that I feel pretty stupid?

Thanks for everyone's advice, though.
Now I feel stupid: what the hell does anti-aliasing do? And how do you disable it?
 
For most programs on most hardware it is off by default.

When dealing with vector graphics, it smooths the "jaggies" - the jagged edges that result from the fact that computer graphics are nothing but images composed of squares (rather large ones, too, compared to the resolution of printed media). A good example of jaggies can be seen if you closely inspect a TrueType font on a system without ClearType; it looks all squarey. A good example of anti-aliasing is ClearType itself.

I used to always leave it on because it made Unreal and Doom look freakin' awesome. There is a performance hit, though, and if your hardware is not up to the task, it will kill your framerate.

How you enable or disable it depends on your hardware. Your card might come with an easy settings utility. Mine is buried deep within my display properties, but the card also came with a system tray utility that makes it a piece of cake.
 
THAT'S why my heightmap wiggles a bit! I think my hardware's up to the task, though; I've got a 9800 Pro overclocked to XT speeds. I always thought you could enable AA/AF through DirectX, but I could never find the option.
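
Digging through the docs afterwards, it looks like you request it through PresentParameters when you create the device. An untested sketch (Form1 and the 4x level are just placeholders):
Visual Basic:
'Untested sketch: ask for 4x multisampling at device creation and fall
'back to none if the hardware doesn't support it. Form1 stands in for
'whatever control you render into.
Dim PresentParams As New PresentParameters()
PresentParams.Windowed = True
PresentParams.SwapEffect = SwapEffect.Discard 'multisampling requires Discard

If Manager.CheckDeviceMultiSampleType(0, DeviceType.Hardware, _
        Format.X8R8G8B8, True, MultiSampleType.FourSamples) Then
    PresentParams.MultiSample = MultiSampleType.FourSamples
End If

Dim D3DDev As New Device(0, DeviceType.Hardware, Form1, _
    CreateFlags.HardwareVertexProcessing, PresentParams)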
 