wainwrightwt Posted July 8, 2005

I have a user who has written a VB app that uses a timer to control events. It requires millisecond accuracy for everything to work correctly. They are currently running it on a laptop (Dell Latitude C640) with SpeedStep set to "maximum performance". They say this has been working fine, but there can be issues if the laptop processor slows down and therefore sends fewer clock steps per second. They are planning to upgrade to a Dell Latitude D505 laptop controlled by the SpeedStep feature. So, the question is: does the VB timer depend on the clock step? And if so, is there a way to deal with the clock step slowing down? Or is there an alternate solution (VB or .NET) that does not depend on the clock step? Sorry if any of this sounds confusing. I am a web developer and don't know much about clock step and the like. Thanks for any help!
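To make the concern concrete: timing code only "depends on the clock step" if it derives time from the CPU's own speed, for example a delay loop calibrated by counting iterations at startup. A hypothetical VB.NET sketch of that fragile pattern (purely illustrative, not taken from the app in question):

    Module CalibratedLoopSketch
        ' Hypothetical illustration only: a delay built from a busy loop that is
        ' calibrated once at startup. If SpeedStep later changes the CPU frequency,
        ' the same iteration count no longer corresponds to 1 millisecond.
        Private iterationsPerMs As Long

        Sub Calibrate()
            Dim startTick As Integer = Environment.TickCount
            Dim count As Long = 0
            Do While Environment.TickCount - startTick < 1000
                count += 1
            Loop
            iterationsPerMs = count \ 1000   ' iterations counted over one second
        End Sub

        Sub DelayMilliseconds(ByVal ms As Integer)
            Dim i As Long
            Dim dummy As Long = 0
            For i = 1 To iterationsPerMs * ms
                dummy += 1                   ' busy-wait; pace depends on CPU clock speed
            Next
        End Sub

        Sub Main()
            Calibrate()
            DelayMilliseconds(5)
            Console.WriteLine("Waited roughly 5 ms, but only at the calibrated CPU speed")
        End Sub
    End Module

Code written this way reads the CPU speed only once, so any later SpeedStep change throws every delay off; the sketches further down show timing sources that avoid the problem.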
Iceplug Posted July 8, 2005

Well, I would say that if the clock on your computer slowed down, then yes, because everything depends on the timer and how often you execute stuff... however, I don't see why slowing down is such an issue if you only need millisecond accuracy... unless you are doing quite a few calculations or have other processes running on the computer, this shouldn't even be a concern... unless the clock is REALLY slowing down. What kind of timer are you using that is giving you millisecond accuracy? It's possible that a cycle could have been skipped.

P.S.: Please post your question only once.
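On the question of what kind of timer gives genuine millisecond accuracy: one common choice in VB is the winmm.dll multimedia timer, which is driven by the system timer interrupt rather than the CPU clock, so a SpeedStep frequency change should not alter its rate. A minimal VB.NET sketch using the standard winmm.dll calls (illustrative only, not code from the app in question):

    Imports System.Threading

    Module MultimediaTimerSketch
        ' timeGetTime is driven by the Windows multimedia timer, not the CPU clock.
        Private Declare Function timeGetTime Lib "winmm.dll" () As Integer
        Private Declare Function timeBeginPeriod Lib "winmm.dll" (ByVal uPeriod As Integer) As Integer
        Private Declare Function timeEndPeriod Lib "winmm.dll" (ByVal uPeriod As Integer) As Integer

        Sub Main()
            timeBeginPeriod(1)                ' ask Windows for 1 ms timer resolution
            Dim startMs As Integer = timeGetTime()
            Thread.Sleep(250)                 ' stand-in for the work being timed
            Dim elapsedMs As Integer = timeGetTime() - startMs
            Console.WriteLine("Elapsed: " & elapsedMs & " ms")
            timeEndPeriod(1)                  ' restore the default timer resolution
        End Sub
    End Module

By contrast, the stock Timer control fires on the coarse WM_TIMER tick, typically somewhere between 10 and 55 ms depending on the Windows version, so it cannot deliver true millisecond scheduling regardless of CPU speed.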
wainwrightwt Posted July 8, 2005 (Author)

Unfortunately I had no involvement in the VB app, so I don't know exactly what's going on, but I would assume the user is using a VB timer control/function. The only thing I could see being a problem is if, when the timer initializes, it determines the number of clock steps being sent per second and uses that to work out how long 1 millisecond is. Then, if the clock steps were to change, it would throw off the timer. Could this at all be possible? If not, then I would think the timer would be accurate. Thanks!

P.S. Sorry about the multiple posts, I wasn't sure if this would be viewed much in the "Random" section.
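That "calibrate once at startup" behaviour is roughly how the Windows high-resolution performance counter gets used: the counts-per-second figure is queried once and assumed constant afterwards, and on some older hardware that counter is backed by the CPU's time-stamp counter, which SpeedStep can affect. A minimal VB.NET sketch of the pattern (illustrative only, not the app's actual code):

    Module PerfCounterSketch
        ' The counts-per-second value is read once and assumed constant; on some
        ' older machines the counter follows the CPU clock, so a frequency change
        ' could skew measurements taken this way.
        Private Declare Function QueryPerformanceCounter Lib "kernel32" (ByRef lpCount As Long) As Integer
        Private Declare Function QueryPerformanceFrequency Lib "kernel32" (ByRef lpFrequency As Long) As Integer

        Sub Main()
            Dim freq As Long, startCount As Long, endCount As Long
            QueryPerformanceFrequency(freq)      ' counts per second, queried once

            QueryPerformanceCounter(startCount)
            System.Threading.Thread.Sleep(100)   ' stand-in for the work being timed
            QueryPerformanceCounter(endCount)

            Dim elapsedMs As Double = (endCount - startCount) * 1000.0 / freq
            Console.WriteLine("Elapsed: " & elapsedMs.ToString("F3") & " ms")
        End Sub
    End Module

The plain VB Timer control and the Timer() function, on the other hand, are driven by the system clock rather than the CPU clock, so a SpeedStep frequency change by itself should not change how fast they run.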