"Direct3D.OutOfVideoMemoryException" : managed directX + c#

GuaW

Hi,
I'm working on a basic 3D engine and I've run into a problem when running my application on certain graphics cards.

The engine is very basic:
From a Form that lets the user select a "game", I create a new Control and use it to create my D3D device. I then load meshes and textures from .X files.
When the "game" ends, I dispose everything, including the Control, and return to the game-selection Form.
On my computer this works fine, but on my test computer I get a Direct3D.OutOfVideoMemoryException after creating/disposing 5 or 6 "games", and only with specific graphics cards.
I watched Device.AvailableTextureMemory: after each game the value decreases, and once it drops below 0 I get the exception.
If I close and relaunch the program, Device.AvailableTextureMemory is back to its initial value and I can launch 5 or 6 games again until the next exception.
I tried using .X files without textures and the problem vanished, so I think I need to force the program to free the loaded textures, but I haven't managed to.
I tried everything (Texture.Dispose(), Device.EvictManagedResources(), GC.Collect(), ...) but the result is always the same, and always with the same graphics cards.
If I catch the OutOfVideoMemoryException the games still launch, but it would be better to prevent the exception in the first place.
So I'd like to know if anyone has an idea, has run into the same problem, or knows a reliable way to free the texture memory.
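
For reference, here is roughly the cleanup pattern I'm using when a game ends. TextureCache is just an illustration of the idea, not my actual class:

using System;
using System.Collections;
using System.IO;
using Microsoft.DirectX.Direct3D;

// Sketch: every texture loaded for a "game" is registered here so it can
// be disposed deterministically when the game ends.
class TextureCache : IDisposable
{
    private Device device;
    private ArrayList textures = new ArrayList();

    public TextureCache(Device device)
    {
        this.device = device;
    }

    public Texture Load(Stream stream)
    {
        Texture tex = TextureLoader.FromStream(device, stream);
        textures.Add(tex);
        return tex;
    }

    public void Dispose()
    {
        // Dispose every texture, then ask the driver to evict whatever
        // is left in the managed pool.
        foreach (Texture tex in textures)
            tex.Dispose();
        textures.Clear();

        device.EvictManagedResources();
        Console.WriteLine("AvailableTextureMemory: " + device.AvailableTextureMemory);
    }
}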

Thanks in advance
 
I found an answer: using Pool.Default instead of Pool.Managed when I load textures with TextureLoader.FromStream(device, stream, width, height, mipLevels, usage, format, pool, filter, mipFilter, colorKey).
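Concretely, the call now looks something like this (a sketch; the D3DX.Default arguments and the filter choices are just the values I'd expect to pass, adapt them to your own loader):

using Microsoft.DirectX.Direct3D;

Texture tex = TextureLoader.FromStream(
    device,
    stream,
    D3DX.Default,      // width: keep the size stored in the file
    D3DX.Default,      // height
    D3DX.Default,      // mipLevels: generate a full mip chain
    Usage.None,
    Format.Unknown,    // take the pixel format from the file
    Pool.Default,      // allocate in video memory instead of the managed pool
    Filter.Triangle,
    Filter.Triangle,
    0);                // colorKey: 0 = no color key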

Does anyone have an idea why Pool.Managed raises a Direct3D.OutOfVideoMemoryException and keeps decreasing Device.AvailableTextureMemory on my test computer with some graphics cards? And can I avoid it without using Pool.Default?
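
One caveat I found with Pool.Default (as far as I understand it): those textures live in video memory and do not survive a device reset, so they have to be released before the reset and recreated afterwards. A rough sketch, where ReleaseTextures and ReloadTextures are hypothetical helpers standing in for your own code:

// Pool.Default resources are lost on a device reset: release them when the
// device is about to be lost and recreate them once it has been reset.
device.DeviceLost += new EventHandler(OnDeviceLost);
device.DeviceReset += new EventHandler(OnDeviceReset);

void OnDeviceLost(object sender, EventArgs e)
{
    ReleaseTextures();   // Dispose() every Pool.Default texture
}

void OnDeviceReset(object sender, EventArgs e)
{
    ReloadTextures();    // load them again on the restored device
}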
 