LDV Posted September 1, 2004
Hi guys, I wrote a short game in C# Managed DirectX, using the Direct3D interface. On my computer and graphics card, the FPS is capped by the screen refresh rate: 85 FPS when using PresentationInterval = Default. When I use PresentationInterval = Immediate it reaches 300 FPS! On other computers, PresentationInterval = Default does not work at all and I can't see the graphics. Why is that? On those machines the refresh rate is not being used to synchronize the program. I want it to always work with PresentationInterval = Default, so the screen refresh rate paces the updates instead of the game running at 300 FPS.
Thanks, LDV
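For context, a minimal sketch of where this setting lives in Managed DirectX 1.1 (written from the MDX documentation, not LDV's actual code; `this` is assumed to be the game's Form):

    using Microsoft.DirectX.Direct3D;

    // Sketch of the setup being discussed (inside a System.Windows.Forms.Form).
    PresentParameters pp = new PresentParameters();
    pp.Windowed = true;
    pp.SwapEffect = SwapEffect.Discard;

    // Default lets the driver choose; on LDV's machine that means vsync at 85 Hz.
    pp.PresentationInterval = PresentInterval.Default;

    // Immediate presents as fast as possible, with no vsync (the 300 FPS case):
    // pp.PresentationInterval = PresentInterval.Immediate;

    Device device = new Device(0, DeviceType.Hardware, this,
                               CreateFlags.SoftwareVertexProcessing, pp);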
ThePentiumGuy Posted September 2, 2004
You could use an FPS limiter. I forget the exact mechanism behind it, but work out something like: run one pass of the game loop every 1000/85 ≈ 12 milliseconds.
Pent

My VB.NET Game Programming Tutorial Site (GDI+, Direct3D, Tetris [coming soon], a full RPG... you name it!): vbprogramming.8k.com
My Project (Need VB.NET Programmers): http://workspaces.gotdotnet.com/ResolutionRPG
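To make that concrete, here is a minimal C# sketch of such a limiter (the class, the RenderFrame method, and the gameOver flag are placeholders, not code from the thread):

    using System;
    using System.Windows.Forms;

    class GameLoop
    {
        const int DesiredFrameRate = 85;
        static bool gameOver = false;       // placeholder loop flag

        static void RenderFrame()
        {
            // hypothetical: your Direct3D drawing and Present() call go here
        }

        static void Main()
        {
            int lastTick = Environment.TickCount;
            while (!gameOver)
            {
                // Render only when one frame interval (1000/85 ≈ 12 ms) has passed.
                if (Environment.TickCount - lastTick >= 1000 / DesiredFrameRate)
                {
                    RenderFrame();
                    lastTick = Environment.TickCount;
                }
                Application.DoEvents();     // keep processing window messages
            }
        }
    }

One caveat: Environment.TickCount only ticks every 10-16 ms on typical systems, so the actual cap will only be approximately 85 FPS.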
Loffen Posted September 14, 2004
Or try PresentInterval.One. This will also limit the FPS to the refresh rate of the monitor/display.
--Loffen
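A sketch of that suggestion, with a capability check for the machines where the default interval misbehaves (this assumes MDX's Caps exposes PresentationIntervals the way the native D3DCAPS9 structure does; verify against your SDK docs):

    using Microsoft.DirectX.Direct3D;

    // Ask for vsync explicitly instead of relying on Default,
    // and verify that the adapter actually supports it.
    PresentParameters pp = new PresentParameters();
    pp.Windowed = true;
    pp.SwapEffect = SwapEffect.Discard;

    Caps caps = Manager.GetDeviceCaps(0, DeviceType.Hardware);
    if ((caps.PresentationIntervals & PresentInterval.One) != 0)
        pp.PresentationInterval = PresentInterval.One;       // lock to refresh rate
    else
        pp.PresentationInterval = PresentInterval.Default;   // fall back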
ThePentiumGuy Posted September 18, 2004
Here's some source code I found in my book:

    Dim lastTick As Integer = System.Environment.TickCount
    Const DesiredFrameRate As Integer = 85

    Do While Not GameOver
        ' Render a frame only when one frame interval (1000/85 ≈ 12 ms) has passed.
        If System.Environment.TickCount - lastTick >= 1000 / DesiredFrameRate Then
            ' <Other Rendering Code>
            lastTick = System.Environment.TickCount
        End If
        Application.DoEvents()  ' keep the window responsive between frames
    Loop

Pent