I am going to reply to this, but will do so in the context of Powerbot since I don't know how Merv does it.
Powerbot sends a TICK event to all modules no more often than once every 100ms, i.e. consecutive TICKs are at least 100ms apart. So if you need to check something "often", you can do so in response to this event. If you only need to check something roughly once a second, you could count off 10 TICK events.
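For example, a module's event handler could just count ticks. Rough sketch only -- EVENT_TICK, tickCount and doOncePerSecondWork() are placeholder names of mine, not the real Powerbot interface:

void MyModule::onEvent(int eventType)
{
    if (eventType == EVENT_TICK)      // core sends this at most once every ~100ms
    {
        if (++tickCount >= 10)        // ~10 ticks is roughly one second
        {
            tickCount = 0;
            doOncePerSecondWork();    // whatever needs to run about once a second
        }
    }
}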
Inside the core, I am in a tight loop doing all the things the core needs to do (network recv/send, reliability layer, various module events). Each bot has its own thread, and to keep the threads from using 100% CPU when there's nothing happening, I call Sleep(5). This is sufficient to bring CPU usage down to 0% even with 20 bots with 10 modules each.
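A bare-bones sketch of what such a per-bot loop looks like (the Bot methods here are placeholders of mine; the real core obviously does more than this):

#include <windows.h>

DWORD WINAPI botThread(LPVOID param)
{
    Bot* bot = (Bot*)param;
    while (bot->running)
    {
        bot->recvPackets();      // network recv
        bot->runReliability();   // reliability layer / resends
        bot->dispatchEvents();   // TICK and other module events
        bot->sendPackets();      // network send
        Sleep(5);                // give up the CPU so an idle bot sits at ~0%
    }
    return 0;
}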
So why can't you count, say, 100 TICK events to get 10 seconds? Because any given thread (or process) is not guaranteed to get processing time, and because Sleep(5) guarantees that you will NOT get back in for 5ms, but does not guarantee that it won't be longer. In fact under XP, I believe, the foreground thread runs for 40ms before being interrupted (unless it gives up its slice by calling Sleep/Wait/GetMessage/etc), and a background thread runs for 20ms. So each TICK event might actually take 120ms, making 100 of them take more like 12 seconds.
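You can see the Sleep() behaviour for yourself with a couple of GetTickCount() calls:

// Sleep(5) guarantees a *minimum* of 5ms; the actual delay is often longer.
// Note that GetTickCount() itself only updates every ~10-16ms, so the value
// measured here will typically come out as 0, 10 or 16 rather than exactly 5.
DWORD before = GetTickCount();
Sleep(5);
DWORD elapsed = GetTickCount() - before;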
So, if you need a very accurate 2500ms, for example, then you should (in Powerbot) use the TICK event and check GetTickCount() yourself against the time the interval started. That gets you an accuracy of between 2500 and 2600ms, plus any delay caused by other processes. If you needed MUCH more accuracy than that, you would need to create your own thread and poll GetTickCount() very often to see when your interval had passed. However, you would probably then still need to sync up with the bot's thread, which would add 100+ms of sync time into the mix, so it would not be worth it.
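Inside the TICK handler that check might look something like this (intervalStart and doTimedWork() are names I'm making up for the example):

DWORD intervalStart;   // set to GetTickCount() when the 2500ms interval starts

void onTick()
{
    DWORD now = GetTickCount();
    if (now - intervalStart >= 2500)   // unsigned subtraction survives wraparound
    {
        intervalStart = now;
        doTimedWork();                 // fires within roughly 2500-2600ms of the start
    }
}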
FYI: The Powerbot core also has a timer function. You can call SetTimer(t) and you will get a TIMER event at t (or greater) ms in the future. The timer is only as accurate as the TICK events, but it's a little more convenient when you need lots of timed events.
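In rough outline it could be used like this (how a module actually receives the TIMER event, and the handler name, are assumptions on my part -- check the real API):

void startRepeatingTimer()
{
    SetTimer(2500);        // ask the core for a TIMER event ~2500ms from now
}

void onTimerEvent()        // hypothetical handler for the TIMER event
{
    doTimedWork();
    SetTimer(2500);        // re-arm it if you want a repeating timed event
}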
Bak - Sat Nov 26, 2005 9:39 pm
Post subject:
That's reasonable; it looks like that's how MERVBot does it too.
Uint32 time = getTime();
...
if (time - lastTick >= 100)
{
    // getTime() is in 10ms units, so 100 here means a full second has passed
    lastTick = time;
    imports->talk(makeTick());
}

Uint32 getTime()
{
    return GetTickCount() / 10;
}
If one wanted more accuracy (though possibly more than one tick per second in some instances), we could do:
if (time - lastTick >= 100)
{
    // Advancing by exactly 100 instead of snapping to 'time' lets the loop
    // fire extra ticks to catch up after a delay, so the long-run average
    // stays at one tick per second.
    lastTick += 100;
    imports->talk(makeTick());
}
This assumes lastTick is initialized properly.
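For instance (sketch only), seeding it from the same clock before the loop first runs:

// Initialize lastTick from the same clock so the first comparison
// doesn't see a huge bogus difference.
Uint32 lastTick = getTime();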
D1st0rt - Sun Nov 27, 2005 2:22 am
Post subject:
It's like if the bots are in a race. The starting gun is fired, but they probably won't start simultaneously with the gun because of the reaction time required to recognize that it is time to start and then actually start.
Cyan~Fire - Sun Nov 27, 2005 2:13 pm
Post subject:
Uhhh, thanks for the analogy.