
A timer's resolution is 20 ms, and a value less than 20 ms is effectively 0 ms


Recommended Posts

Posted (edited)

Whenever I copy and paste a timer whose "time to go off" value is 5 ms, the copied timer is set to 20 ms.  This seemed to me a strong hint that 20 ms is the smallest possible delay. Testing has proven to me, without any doubt, that this is the way it is.  And unfortunately, it also demonstrated that any timer value less than 20 ms is treated as 0 ms, which was quite a surprise to say the least (and to say it politely).  Things may have been different in the past, but not now ☹️.

 

Update 18/4:

Recent testing has confirmed, without any doubt, that the timer RESOLUTION is 20 ms!

 

This means that regardless of the timer's value, it's ALWAYS TRUNCATED to the nearest 20 ms multiple.  Examples: a timer value of 50 ms is effectively 40 ms, and a value of 175 ms is effectively 160 ms.  What is also interesting is that the 20 ms setting appears to be a bit less than 20 ms in real time, say about 19.7 ms (I didn't bother to dig really deep into this value, but it's consistently a little less than 20 ms).  Refer to this post below for more info.
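To put the truncation in concrete terms, here is a rough Python model of what the tests imply (a sketch only, not the sim's actual code; the 20 ms tick is the value measured above):

```python
# Rough model of the observed behaviour (assumption: a fixed 20 ms tick).
TICK_MS = 20

def effective_delay_ms(requested_ms: float) -> int:
    ticks = int(requested_ms // TICK_MS)  # truncation, not rounding
    return ticks * TICK_MS

for value in (5, 20, 50, 175):
    print(f"{value} ms requested -> {effective_delay_ms(value)} ms effective")
# 5 -> 0, 20 -> 20, 50 -> 40, 175 -> 160
```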

Edited by AcidBath
  • Upvote 1
Mitthrawnuruodo
Posted (edited)

I don't think that's very surprising. If the simulation runs at 50 Hz (a reasonable rate), it cannot correctly process events that are less than 20 ms apart since the delay is shorter than the game update cycle.

 

What are you trying to accomplish with these short timers?

Edited by Mitthrawnuruodo
Posted (edited)

I am using these small delays when serially sampling (or, better put, polling) arrays of switches that collect object events.  The ME manual has multiple examples of using 5 ms time delays for such purposes, which is where I got the idea to use such small values.  BTW, 20 ms will not be a problem for what I need in these situations.

 

Edit:

Also, the surprise was not the 20 ms minimum.  The surprise was that if a timer is set to less than that, it becomes a no-wait timer, which could really mess up certain logic constructs.

Edited by AcidBath
Posted
On 2/6/2021 at 1:42 AM, AcidBath said:

I am using these small delays when serially sampling (or, better put, polling) arrays of switches that collect object events.  The ME manual has multiple examples of using 5 ms time delays for such purposes, which is where I got the idea to use such small values.  BTW, 20 ms will not be a problem for what I need in these situations.

 

Edit:

Also, the surprise was not the 20 ms minimum.  The surprise was that if a timer is set to less than that, it becomes a no-wait timer, which could really mess up certain logic constructs.

Be careful about the polling system at high frequency. This will impact the simulation with a time dilation effect that could become prohibitive.

Just test this by comparing internal and external clocks.
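A minimal sketch of the kind of comparison being suggested, assuming you have some way to read the mission's internal clock (the `mission_time` callable here is a hypothetical placeholder, not a real API):

```python
import time

def dilation_ratio(mission_time, sample_s: float = 60.0) -> float:
    """Compare sim time against wall-clock time over a sample window.

    mission_time is a hypothetical callable returning the mission's internal
    clock in seconds; a result noticeably below 1.0 indicates time dilation.
    """
    wall_start = time.monotonic()
    sim_start = mission_time()
    time.sleep(sample_s)
    sim_elapsed = mission_time() - sim_start
    wall_elapsed = time.monotonic() - wall_start
    return sim_elapsed / wall_elapsed
```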

 

Posted (edited)

The rate at which a switch array is polled is 1 Hz, i.e. once a second all the array elements are polled sequentially.  Each array element polled pauses 20 ms before the next element is polled.  The idea is that counters collect (i.e. count) the polled results, if any, and the minimum interval at which a counter can be pulsed is 20 ms.  I can see what you are getting at with your post above.  Increasing the wait to 50 ms won't affect the poll rate while relieving the sim engine's execution burden, in terms of trading a 50 Hz thread continuation for a 20 Hz one.
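As a back-of-the-envelope check of that trade-off (illustrative numbers only; the array size is an assumption), the sweep just has to finish inside the 1 s poll period:

```python
POLL_PERIOD_MS = 1000  # the array is swept once per second

def sweep_time_ms(num_elements: int, per_element_wait_ms: int) -> int:
    # Total time to step through the whole array at the given per-element wait.
    return num_elements * per_element_wait_ms

for wait in (20, 50):
    n = 16  # assumed array size, for illustration only
    t = sweep_time_ms(n, wait)
    status = "fits within" if t < POLL_PERIOD_MS else "overruns"
    print(f"{n} elements at {wait} ms -> {t} ms sweep, {status} the 1 s period")
```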

 

BTW, when I was testing the timer minimum, I found that a single "mission thread" can lock up the sim if it never uses a wait and runs forever, e.g. two zero-wait timers tied together in a ping-pong manner.
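Conceptually (this is a Python analogy, not ME logic), those two zero-wait timers amount to a loop that never yields control back to the engine:

```python
def ping_pong_forever():
    active = "A"
    while True:                  # no wait anywhere, so this never yields
        active = "B" if active == "A" else "A"
        # a wait of at least one 20 ms tick here would hand control back
        # to the engine each cycle and avoid the lockup
```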

Edited by AcidBath
Posted
On 2/5/2021 at 7:42 PM, AcidBath said:

I am using these small delays when serially sampling (or, better put, polling) arrays of switches that collect object events.  The ME manual has multiple examples of using 5 ms time delays for such purposes, which is where I got the idea to use such small values.  BTW, 20 ms will not be a problem for what I need in these situations.

 

Edit:

Also, the surprise was not the 20 ms minimum.  The surprise was that if a timer is set to less than that, it becomes a no-wait timer, which could really mess up certain logic constructs.

 

Interesting find AcidBath, thanks. This limit would affect logic such as that used in the smaller timers of the Damage Display Switch and the Multi-Input Counter (along with some of the examples in the manual).

Posted
On 2/11/2021 at 10:59 PM, AcidBath said:

BTW, when I was testing the timer minimum, I found that a single "mission thread" can lock up the sim if it never uses a wait and runs forever, e.g. two zero-wait timers tied together in a ping-pong manner.

You have now reached the limits. With the trials and tests that you and Sketch did, we have a better understanding of how these counters and timers work and how to influence them.

Precious work. Thanks.

  • 2 months later...
Posted (edited)

Recent testing has confirmed, without any doubt, that the timer RESOLUTION is 20 ms!

 

This means that regardless of the timer's value, it's ALWAYS TRUNCATED to the nearest 20 ms multiple.  Examples: a time of 50 ms is effectively 40 ms, a time of 175 ms is effectively 160 ms.  What is also interesting is that the 20 ms setting appears to be a bit less than 20 ms in real time, say about 19.7 ms (I didn't bother to dig really deep into this value, but it's consistently a little less than 20 ms).

 

<Guess> My gut reaction tells me that their 'mission engine' is a thread that runs at roughly a 50 Hz rate, i.e. it starts running again every ~20 ms, and a timer's value is turned into counts of these 50 Hz execution frames. If a mission engine execution frame's compute time is longer than 20 ms, the thread scheduler probably drops (i.e. does not run) the frame that should have started during that overrun. And if time is maintained by counting these execution frames, the lost frame stretches simulated time relative to real time unless it is accounted for; the result is 'time dilation'. </Guess>  Note that 20 ms of compute time is a LONG TIME for modern computers, e.g. my i7 10700k rig.
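As a purely illustrative sketch of that guess (all of this is assumption, not the actual engine), a fixed-step loop whose work overruns its 20 ms budget accumulates ticks more slowly than wall time, which is exactly the time-dilation symptom:

```python
import time

TICK_S = 0.020  # assumed 50 Hz mission-engine frame

def run_mission_engine(mission_step, wall_budget_s: float) -> None:
    ticks = 0
    start = time.monotonic()
    while time.monotonic() - start < wall_budget_s:
        frame_start = time.monotonic()
        mission_step()                      # may occasionally take > 20 ms
        ticks += 1
        spare = TICK_S - (time.monotonic() - frame_start)
        if spare > 0:
            time.sleep(spare)               # overruns are never caught up
    sim_clock = ticks * TICK_S              # time as the mission logic sees it
    wall_clock = time.monotonic() - start
    print(f"mission clock {sim_clock:.2f} s vs wall clock {wall_clock:.2f} s")
```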

 

BTW, before retiring I was a software engineer working in aerospace, where I spent 35 years developing and maintaining real-time simulations for guidance systems testing in hardware-in-the-loop labs.

Edited by AcidBath
  • Like 1
  • Thanks 1
  • AcidBath changed the title to A timer's resolution is 20 ms, and a value less than 20 ms is effectively 0 ms
Jaegermeister
Posted

Great info. This might be of interest in addressing some issues in the future. I would guess it's not something that can be altered without a major overhaul though. 

Posted

I don't see the fixed 50 Hz update frequency as a problem, but it would be nice if timers were ordered by their precise release times and triggered in that order.
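Something like this sketch (an assumption about how it could work, not how the engine does it): timers due within the same 20 ms tick would still all fire on that tick, just in order of their exact expiry times.

```python
import heapq

def release_due_timers(timer_heap, now_ms: float):
    """timer_heap holds (expiry_ms, timer_id) tuples kept as a min-heap;
    returns the ids due by now_ms, ordered by exact expiry time."""
    released = []
    while timer_heap and timer_heap[0][0] <= now_ms:
        _expiry, timer_id = heapq.heappop(timer_heap)
        released.append(timer_id)
    return released
```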

  • Upvote 1
Posted (edited)

I did the exact same tests with Rise of Flight, using its mission editor.

 

And the RoF Timer MCU behaves EXACTLY as discussed above, even with the same not-quite-20 ms real time (though a copy/paste doesn't 'correct' a 5 ms timer).

 

So this raises a question for me.  Has it been that:

 

1.  Timers at one time worked as they should, i.e. a 75 ms timer expired at 75 ms, and BoS (and possibly RoF) has since been changed so its timers no longer do.

OR

2. Most everyone uses timers that are 'big enough' so their mission logic works properly regardless.

OR

3. Timers have always been as they are now; mission writers assumed "what you set is what you get" and didn't test thoroughly enough to find problems with mission logic that depends on timers expiring at closely spaced (< 20 ms) intervals.

 

Heck, I'm a noob when it comes to mission writing, and it was inconsistencies that led me to uncover this info.  Considering this, I find it astonishing that nobody has ever looked into how timers actually behave with respect to their expiration times...but hey, that's me, someone trained to RL military-grade software standards.

 

 

Maybe it's best that the past is now a mystery, and looking forward is all that matters now.

 

Edited by AcidBath
Jaegermeister
Posted

I mostly use timers of 1 second or longer, so it never really mattered. In my experience the AI needs at least that long to process most commands, or they randomly get lost. 2 seconds is better for processing a command formation MCU before a waypoint. I also never have more than 2 outputs from a trigger timer, or those also randomly get lost.

 

Posted (edited)

Most of the time I also wait 2 seconds after the last event (e.g. spawn, waypoint) before issuing AI commands.  For mission initialization, where I use the number of players to determine how much AI to spawn, I use timers in the 0.1 to 1 second range to synchronize the setup of the mission elements.  It was when I started creating arrays of pollable switches, designed to catch events from large-ish quantities of AI, that I bumped into problems using tiny timer values.  Each AI or mission object is assigned one or more switch array elements; the number of elements depends on how many types of the object's events are being monitored.  These arrays are polled once a second, one element at a time; this allows a mass of AI events occurring over a very small time period to be serially sampled at a fixed rate, which guarantees that the Counter MCUs counting these events don't miss any of them.
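For what it's worth, here is a rough Python model of that polling scheme (my own abstraction of the MCU wiring, not anything taken from the game):

```python
class SwitchElement:
    """Models one switch-array element that latches an object event."""
    def __init__(self):
        self.latched = False

    def latch(self):
        # Wired to an object event report (e.g. killed, damaged).
        self.latched = True

    def poll(self) -> bool:
        # Read-and-clear, performed one element at a time during the sweep.
        fired, self.latched = self.latched, False
        return fired

def sweep(array, count: int) -> int:
    # One pass over the array; in the mission each step is spaced 20-50 ms
    # apart, so the downstream Counter MCU sees one clean pulse per event.
    for element in array:
        if element.poll():
            count += 1
    return count
```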

Edited by AcidBath
