DirectX with GDI+

markbiddlecom

Newcomer
Joined
Jun 30, 2003
Messages
13
Location
Denver, CO
OK, so I've been surfing a while for information on this, and I've come up miserably short:

I'm writing a simple top-down game engine with Direct3D 9, and I was going to just use GDI+ for some of the special effects drawing code, because I know how to work with it and I like the interfaces. Anyway, I found out pretty fast that you can't draw between the BeginScene and EndScene methods, which makes sense, but I was having initial success by drawing the graphics after executing the Present method.

At least, until I went to full screen view. Now, the GDI+ drawing never shows up at all. I've moved the code around, slept on the current thread after drawing, dealt with synchronous and asynchronous code, and I just can't figure out what's going on.

Anyway, I gave up on that to go deal with other issues (AI, scripting, etc.), and stumbled across some methods in the managed API that I thought would do the job for me: Device.getRenderTarget (returns a Direct3D.Surface object), Surface.getGraphics, and Surface.releaseGraphics. (Don't quote me on the names, but I think they're right.)

Well, I figured that I could grab a reference to one of the surfaces, snatch a graphics object from the surface, draw some old fashioned lines, and then release the graphics reference. The only problem, as I mentioned earlier, is that I can't find any documentation whatsoever. Microsoft's docs for getGraphics says "Returns a GDI+ Graphics instance," or something to that effect.

I've placed the code just about everywhere I can think of in relation to the BeginScene/EndScene/Present block, and I can't seem to execute the getGraphics method without getting a vague DirectX exception.

So it is that I found my way to this forum, and I'm practically on my knees here begging for help. It's either this or try to learn about whatever functionality the Direct3D API offers for drawing semitransparent lines and other such things. Or just move to GDI+ for the whole blasted engine...

Thanks in advance,
Mark
 
I think the surfaces also support a GetDC method for using the GDI functions (not GDI+).

But, to help with the errors (since the dx9 docs are mostly incomplete), use dbmon. Here's what to do:
Before you open Visual Studio, go to your DX9 install folder, then into bin\DXUtils, and run dbmon.exe. It will show a console-like window with some text. Leave this open.

Now open your project in Visual Studio. Make sure you're in Debug mode and run the application. Set a breakpoint on any line that raises an error (or just after it), or simply exit the app. If you now look at your console window (dbmon), you'll (hopefully) see a ton of good info about what's going on. The messages often reference invalid parameters and constants in the C++ naming convention (all-caps constant names instead of the managed enum names), but you should be able to figure out what's wrong.

Good luck!
-Ner
 

EDIT

After some digging and additional playing around, I've managed to answer my own question.

Basically, getGraphics() seems to be a GDI+ wrapper for the GetDC method, and GetDC is itself a wrapper for the process of locking a rectangle on the render target and working with the resulting graphics stream.

Now, my knowledge of the Graphics class and GDI+ is limited, but my understanding is that it still works with old-fashioned device contexts under its shiny new exterior (perhaps at the pixel level), and that you can essentially wrap a Graphics object around any device context, so this all makes sense. I think it does, anyway.

Anyway, in order to use LockRectangle, GetDC, and GetGraphics, you need to create a lockable back buffer, as the original error message alluded to. This can be done by adding the PresentFlag.LockableBackBuffer flag to the PresentFlag member of the PresentParameters structure.
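In C#, the relevant change looks something like this (a minimal sketch; the rest of the device setup is assumed to already exist, and myForm is a placeholder for your render window):

```csharp
using Microsoft.DirectX.Direct3D;

PresentParameters pp = new PresentParameters();
pp.Windowed = false;
pp.SwapEffect = SwapEffect.Discard;
// ...plus your usual back-buffer size/format settings...

// OR the flag in so any present flags you already set are preserved.
pp.PresentFlag |= PresentFlag.LockableBackBuffer;

Device device = new Device(0, DeviceType.Hardware, myForm,
    CreateFlags.SoftwareVertexProcessing, pp);
```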

The Microsoft documentation notes that doing so will cause a performance penalty on some cards. I haven't done any benchmarking myself yet, but I'm doing relatively simple 2D emulation (lol--never thought I'd say "2D emulation") and I'm not too worried about the performance.

Thanks again for your help :D!

-Mark

ORIGINAL MESSAGE

Wow! That's actually really helpful--in that it makes me see how much I've yet to learn :p.

I've got a collection of errors and warnings here. Some of them seem to be unrelated --

2644: Direct3D9: (WARN): Device that was created without D3DCREATE_MULTITHREADED is being used by a thread other than the creation thread.

-- and then I've got this beauty:

2644: Direct3D9: (ERROR) :For D3DPOOL_DEFAULT surfaces, GetDC requires the surface be either a lockable render target, an offscreen plain, or created with D3DUSAGE_DYNAMIC.

This is definitely an error message I can decipher--so thanks for your help!

...aaaaaaaand...

Since I'm here, I'll try picking your brain a bit extra: what are the consequences of creating a device with the MULTITHREADED flag--is there a reason I wouldn't want to do this? As a little extra information, I've got a class in my app called a RenderPipeline to which other objects are attached. The pipeline calls prerender, render, and postrender objects in sequence on its own thread. As the error indicates, that thread is not the same one that created the device. However, all accesses to the device are performed on the new thread.

Secondly, I'm having some trouble determining how to create a "lockable render target," or even how to specify D3DUSAGE_DYNAMIC for the surface's creation. Is this a flag I have to include when I'm creating the presentation parameters?

Also, it seems as though getting a Graphics object just uses the DC after all--is it therefore true that if I dug around for info on the GetDC method, I'd find something more to help me here?

-Mark
 
Whew, someone who types as much as me :)

I'm not sure about the effects of marking textures as multithreaded so I can't offer any help. I've used the Dynamic usage before. For my purposes, I was locking a copy of the original texture and tweaking the bits. Here's the code I used to copy and lock a texture:
C#:
SurfaceDescription desc = textureOrig.GetLevelDescription(0);

// Create a same-sized, same-format texture to hold the copy.
Texture textureCopy = new Texture(dev, desc.Width, desc.Height, 1, 0, desc.Format, Pool.Managed);
Surface dst = textureCopy.GetSurfaceLevel(0);
Surface src = textureOrig.GetSurfaceLevel(0);
SurfaceLoader.FromSurface(dst, src, Filter.None, 0);

// Get the bits for the surface and re-color them.
ColorType[] c = (ColorType[])dst.LockRectangle(typeof(ColorType), LockFlags.None, desc.Width * desc.Height);

The ColorType is a struct I defined that I KNOW matches the format of the texture I'm loading. I know because I tweaked that texture after loading to ensure it's in the proper format. If you use the built-in methods to load a texture from a file or resource, the pixel format you pass is only taken as a hint, so you can't count on it.
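For reference, a pixel struct for the common A8R8G8B8 format might look like the following (the name and layout here are my own example; check it against the desc.Format you actually get back):

```csharp
using System.Runtime.InteropServices;

// D3DFMT_A8R8G8B8 is a 32-bit pixel stored as 0xAARRGGBB. On a
// little-endian machine that puts the bytes in memory as B, G, R, A,
// so the fields are declared in that order.
[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct ColorType
{
    public byte Blue;
    public byte Green;
    public byte Red;
    public byte Alpha;
}
```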

Good Luck!

-Nerseus
 
Code?

Here's some sample code to do what we've been talking about, for anybody it may help:

Visual Basic:
Public Sub drawWithGDIPlus()

   Dim device As Microsoft.DirectX.Direct3D.Device
   Dim verts As VertexBuffer

   ' We'll use these two variables in our GDI+ drawing. 
   Dim surface As Microsoft.DirectX.Direct3D.Surface
   Dim g As System.Drawing.Graphics

   Dim pp As New Microsoft.DirectX.Direct3D.PresentParameters

   ' Set up your normal presentation parameters here 
   ' My experience tells me that this method works both in 
   ' windowed and full screen modes. 
   
   ' Apply your normal flags to the pp.PresentFlag member, and 
   ' then add this flag:
   pp.PresentFlag = pp.PresentFlag Or _
     Microsoft.DirectX.Direct3D.PresentFlag.LockableBackBuffer

   ' Note that I used Or (not And) so that any flags you've already
   ' set are preserved.  If you have no other flags, you can just
   ' assign the flag directly.

   ' Now create your device as you normally would, using pp as 
   ' your present parameters. 

   ' Now, begin your scene, as you're accustomed to doing...
   device.BeginScene()

   ' Now perform your drawing code as per normal... 

   ' When you need it, intersperse this code to use GDI+ to draw
   ' over the current contents of the back buffer:

   surface = device.GetRenderTarget(0)
   g = surface.GetGraphics()

   ' Use g to draw to the surface.  You'll need to know the
   ' surface dimensions beforehand. 
   
   ' For example, 
   g.DrawLine( _
    New System.Drawing.Pen(System.Drawing.Color.Black), _
    0, _
    0, _
    1, _
    1 _  
   )

   ' When you're done with the GDI drawing, make sure to
   ' perform the following call:
   surface.ReleaseGraphics()

   ' Then clear up the graphics object.  This isn't 
   ' required, but it can help:
   g.Dispose()
   g = Nothing

   surface.Dispose()
   surface = Nothing

   ' It's important to remember that you can't intermix GDI+ and
   ' DirectX drawing code; doing so will result (in the best case)
   ' in a runtime error.  However, it would appear to me that you
   ' can use GDI+ code at any stage and more than once during
   ' a single scene.

   device.EndScene()

   ' And, of course, make sure to clean up the DirectX objects you
   ' used.
   verts.Dispose()
   device.Dispose()

   verts = Nothing
   device = Nothing

End Sub
 
This is pretty good, but it eats up CPU. I also tried:

Visual Basic:
Private BackSurface As Direct3D.Surface

dev.BeginScene()

Dim window As Bitmap = New Bitmap(rect.Width, rect.Height) 'bitmap to hold graphics
Dim surface As Graphics = Drawing.Graphics.FromImage(window) 'holds graphics till ready to blit

surface.DrawLine( _
    New System.Drawing.Pen(System.Drawing.Color.Black), _
    0, _
    0, _
    1, _
    1 _  
   )



BackSurface = BackSurface.FromBitmap(dev, window, Pool.Default)

' Release the GDI+ objects each frame, or a new Bitmap and Graphics
' will pile up for the garbage collector every pass.
surface.Dispose()
window.Dispose()

' This is where the surface is copied to the back buffer:
dev.StretchRectangle(BackSurface, New Rectangle(0, 0, AForm.ClientRectangle.Width, AForm.ClientRectangle.Height), BackBuffer, New Rectangle(0, 0, AForm.ClientRectangle.Width, AForm.ClientRectangle.Height), TextureFilter.None)

dev.EndScene()
dev.Present()


This is also CPU intensive. What I would like to do is use DirectDraw to draw all the circles and squares and then somehow present that on the Direct3D device. Anyone have any idea how to do that?
 
The performance problem here is not CPU. There are two problems:

1. Bus traffic. You are copying the image from the backbuffer to system memory, changing it, and then copying it back. The numbers depend on your display depth and size and your bus speed, but it's not free.

2. Parallelism. Remember that on most graphics cards, the 3d stuff is rendered on the video card. This can happen *while* you're doing your GDI and other stuff. When you lock the buffer, you are going to sit there until the card's done rendering, effectively forcing the GPU and CPU to run serially.

A better way to get the effect you're after (not the best, but better) is to render your GDI stuff into a separate texture and apply that texture to a two-polygon square in your 3d scene. You'll still incur bus traffic sending the texture *to* the card, but not as much, and you won't break parallelism (as much - the card will still have to wait for your texture...)
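A rough sketch of that texture-overlay approach in C# (the names overlay and overlayTexture and the 256x256 size are my own assumptions; error handling and the alpha-blending render states are omitted):

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using Microsoft.DirectX.Direct3D;

// Draw the GDI+ content into an ARGB bitmap once (or only when it changes).
Bitmap overlay = new Bitmap(256, 256, PixelFormat.Format32bppArgb);
using (Graphics g = Graphics.FromImage(overlay))
{
    g.Clear(Color.Transparent);
    g.DrawLine(Pens.Black, 0, 0, 255, 255);
}

// Copy the bitmap into a texture. This sends it over the bus once,
// instead of locking the back buffer every frame.
Texture overlayTexture = new Texture(device, overlay, Usage.None, Pool.Managed);

// Render it on a screen-aligned quad (pre-transformed vertices,
// so no transform matrices are needed).
CustomVertex.TransformedTextured[] quad = new CustomVertex.TransformedTextured[4];
quad[0] = new CustomVertex.TransformedTextured(0f,   0f,   0f, 1f, 0f, 0f);
quad[1] = new CustomVertex.TransformedTextured(256f, 0f,   0f, 1f, 1f, 0f);
quad[2] = new CustomVertex.TransformedTextured(0f,   256f, 0f, 1f, 0f, 1f);
quad[3] = new CustomVertex.TransformedTextured(256f, 256f, 0f, 1f, 1f, 1f);

device.SetTexture(0, overlayTexture);
device.VertexFormat = CustomVertex.TransformedTextured.Format;
device.DrawUserPrimitives(PrimitiveType.TriangleStrip, 2, quad);
```

If the overlay only changes occasionally, build the texture once and redraw just the quad each frame; that keeps the per-frame bus traffic near zero.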
 