wyrd
Senior Contributor
I can't seem to get the performance I want out of this thing, and it's baffling me to no end.
Here's the breakdown: running in full screen, drawing the same sprites 5,000 times onto the screen, at 20 fps.
This is fine and dandy, at least until I minimize and then re-maximize the game. When I try to maximize it after it's been minimized, it takes about 5-10 seconds for the graphics to reappear on the screen. This is unacceptable.
I tried several things before arriving at my current implementation. The first was simply checking TestCooperativeLevel() before drawing each image. That worked, but it shot my frame rate down to about 5 fps. Ew.
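For reference, that first attempt looked roughly like this (reconstructed from memory; _drawSprite is just a stand-in name for my per-sprite draw helper):
C#:
// Attempt 1 (sketch): test the cooperative level before every single blit.
// With 5,000 sprites per frame, that's 5,000 TestCooperativeLevel() calls per frame.
private void _drawSprite(Surface sprite, Rectangle dest, Rectangle src)
{
    if (_device.TestCooperativeLevel()) {
        _buffer.Draw(dest, sprite, src, DrawFlags.Wait);
    }
}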
After that, I simply put the recovery routine (RestoreAllSurfaces()) inside a try/catch block. However, the problem I described above cropped up here as well, except instead of 5-10 seconds it was more like 20.
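If I remember right, each draw ended up wrapped something like this (again, just a sketch of the idea, not the exact code):
C#:
// Attempt 2 (sketch): catch the lost-surface exception and restore everything on the spot.
try {
    _buffer.Draw(dest, sprite, src, DrawFlags.Wait);
}
catch (SurfaceLostException) {
    _device.RestoreAllSurfaces();
}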
The next implementation I tried was a boolean flag (called IsLost) that was checked in every draw routine: when a SurfaceLostException was caught, IsLost was set to true to keep the remaining drawing routines from drawing. Then, when the main Display routine was called, it checked the flag and restored the surfaces if needed. Unfortunately, this was a somewhat ugly "hack", and it also caused problems when a single image was lost in the middle of a frame, giving that frame a "skipped" look. Not acceptable.
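Roughly, the flag version looked like this (sketch; _isLost is the field name I used):
C#:
// Attempt 3 (sketch): remember that a surface was lost, skip further drawing,
// and defer the actual restore to the main Display() routine.
private bool _isLost = false;

private void _drawSprite(Surface sprite, Rectangle dest, Rectangle src)
{
    if (_isLost) return;  // everything after the lost sprite gets skipped this frame
    try {
        _buffer.Draw(dest, sprite, src, DrawFlags.Wait);
    }
    catch (SurfaceLostException) {
        _isLost = true;
    }
}

// In Display():
// if (_isLost) { _restoreSurfaces(); _isLost = false; }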
The second-to-last implementation I tried was checking each surface's Surface.IsLost property individually. Unfortunately, this property seems to be buggy, and I had several issues getting it to work right. Even if it had worked, I'd assume that checking IsLost on every surface would cost about as much as just calling RestoreAllSurfaces().
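That version was basically this pattern (sketch; _sprites stands in for my surface collection, and Restore() is whatever the per-surface restore call is):
C#:
// Attempt 4 (sketch): check and restore each surface individually.
foreach (Surface s in _sprites) {
    if (s.IsLost) {
        s.Restore();
    }
}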
The final implementation, which so far seems to be the best, has the client check that the game surface can be drawn to before actually trying to draw on it. It does this by reading the game surface's CanDraw property:
C#:
/// <summary>
/// Gets whether this DDGameSurface can be drawn on.
/// </summary>
public bool CanDraw
{
    get { return _device.TestCooperativeLevel(); }
}
So far so good, right? I mean, if you check this once per frame, that's a heck of a lot better than once per sprite (5,000 of 'em). This also lets me put the error checking inside a single routine, the Display() method of my game surface. Upon catching an error, it tries to recover all surfaces:
C#:
/// <summary>
/// Display the current DDGameSurface on to the screen.
/// </summary>
public void Display()
{
    // Rectangle for surface and buffer is identical.
    Rectangle rect = Rectangle.Empty;
    rect.Width = _surface.SurfaceDescription.Width;
    rect.Height = _surface.SurfaceDescription.Height;

    // Draw buffer to screen.
    try {
        _surface.Draw(rect, _buffer, rect, DrawFlags.Wait);
    }
    catch {
        // Surface was lost.
        _restoreSurfaces();
    }
}
This leaves the client code looking something like this:
C#:
// Inside the process-graphics routine.
if (_gameSurface.CanDraw) {
    // Draw sprites.
}

// Inside the game loop.
_processSound();
_processGraphics();
// ...
_gameSurface.Display();
Maybe I'm totally missing something, but based on this there should only be a single call to TestCooperativeLevel() per frame. Given that, there shouldn't be any problem, right? But there is: maximizing a minimized game still takes 5-10 seconds before it redisplays the graphics or even responds to keyboard input.
If anyone can offer me some help, I'd appreciate it.