Fair warning: this is a complicated one...
I am working on making my Win98-compatible application run on XP. The application includes a scrolling graph, which must refresh at a high rate. Since timers don't "count down" while their code is executing, this was always a problem, as the execution time was variable and longer than the required refresh period. As suggested (I think in the MS Knowledge Base), I solved the problem by using a number of timers, all running at the same time, each one adding a point to the graph. That was the only way to achieve a sufficient refresh rate.

Running the app on any Win98 system, even after moving from VB5 to VB6, seemed to work the same regardless of system speed or compiler version. Running the exact same executable on an XP machine, however, produced a much faster refresh rate. So I made a separate version for XP that uses only 2 or 3 timers in parallel instead of 6, and I adjusted some of the timer intervals until I got the right refresh rate. That worked great on the [XP Pro] development machine, whether I ran it through VB or compiled it and ran the executable directly.

Then I tried to run it on a test laptop (a significantly slower machine with less RAM, running XP Home). To my surprise, the refresh rate was even faster than on the development machine! Based on my experience with the Win98 version, I did not expect the system to make a difference, since both the development and test machines were XP. If anything, I would have predicted a slower refresh rate on the slower system, but I saw the opposite. The difference clearly depends on the system somehow, but it's obviously not simply system speed. Of course I can tune the timers to match a given system, but I need a more robust and widely applicable solution.
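To make the workaround concrete, here is an idealized sketch (in Python, not the original VB code) of the "several timers in parallel" trick: N timers all share the same period but are started with staggered offsets, so together they fire N times per period and the effective refresh interval becomes period / N. This ignores handler execution time and real timer-resolution effects, which is exactly where my problem seems to live; the function name and numbers are illustrative, not from the actual app.

```python
def staggered_fire_times(n_timers, period_ms, duration_ms):
    """Return the sorted tick times (ms) produced by n_timers timers,
    each firing every period_ms, started period_ms / n_timers apart."""
    offset = period_ms / n_timers
    times = []
    for i in range(n_timers):
        t = i * offset  # each timer starts one offset later
        while t < duration_ms:
            times.append(t)
            t += period_ms
    return sorted(times)

# Six 60 ms timers, staggered, behave like one ideal 10 ms timer:
ticks = staggered_fire_times(6, 60, 240)
intervals = [b - a for a, b in zip(ticks, ticks[1:])]
# every gap between consecutive ticks is 10 ms in this idealized model
```

In practice the achieved rate also depends on how the OS rounds the requested interval to its internal timer resolution, which is presumably why the same timer counts behave differently on different systems.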
Can anyone offer some insight?