Troubleshoot Unexpected Measured Sample Time Value

The sample time measured from the model deviates from the sample time requested in the model.

What This Issue Means

Sometimes the sample time that you measure from your model is not equal to the sample time that you requested. This difference depends on your target computer. Your model sample time is as close to your requested time as the target computer CPU allows.

Some amount of error is common for most computers. The margin of error varies from machine to machine.

Most high-level operating systems, like Microsoft® Windows® or Linux®, occasionally insert extra-long intervals to compensate for errors in the timer. The Simulink® Real-Time™ software does not attempt to compensate for timer errors. For this product, close repeatability is more important for most models than exact timing. However, some chips have design characteristics that produce residual jitter, which can potentially change your system behavior. For example, some Intel® Pentium chips produce residual jitter on the order of 0.5 microseconds from interrupt to interrupt.

Digital processing cannot set the spacing between timer interrupts with infinite precision. This limitation causes the divergence between requested and measured sample times.

For the supported target computers, the only timer that can generate interrupts is based on a 1.193-MHz clock. For the Simulink Real-Time system, the timer is set to a fixed number of ticks of this frequency between interrupts. If you request a sample time of 1/10000 seconds, or 100 microseconds, you do not get exactly 100 ticks. Instead, the Simulink Real-Time software calculates that number as:

100 × 10⁻⁶ s × 1.193 × 10⁶ ticks/s = 119.3 ticks

The Simulink Real-Time software rounds this number to the nearest whole number, 119 ticks. The actual sample time is then:

119 ticks / (1.193 × 10⁶ ticks/s) = 99.75 × 10⁻⁶ s
(99.75 microseconds)

Compared to the originally requested sample time of 100 microseconds, this sample time is 0.25% faster.
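The tick-rounding arithmetic above can be sketched in a few lines of Python. This is an illustrative calculation only, not part of the Simulink Real-Time software; the `actual_sample_time` helper name is an assumption for this example, while the 1.193 MHz timer frequency comes from the text.

```python
# Sketch of the tick-rounding arithmetic described above.
TIMER_HZ = 1.193e6  # timer clock on the supported target computers (ticks/s)

def actual_sample_time(requested_s):
    """Round the requested sample time to a whole number of timer ticks."""
    ticks = round(requested_s * TIMER_HZ)  # e.g. 119.3 rounds to 119
    return ticks, ticks / TIMER_HZ

ticks, actual = actual_sample_time(1 / 10000)  # request 100 microseconds
print(ticks)                # 119 ticks
print(actual * 1e6)         # about 99.75 microseconds
deviation = (1 / 10000 - actual) / (1 / 10000)
print(f"{deviation:.2%}")   # 0.25% faster than requested
```

Because the tick count is rounded down here, the actual sample time is slightly shorter than requested, which is why the model runs slightly fast.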

Try This Workaround

You can use the calculated value for the number of ticks in the requested sample time to derive the expected measured sample time for your target computer. Assume the following:

• Output board that generates a 50 Hz sine wave (expected signal)

• Sample time of 1/10000

• Measured signal of 50.145 Hz

The difference between the expected and measured signals is 0.145 Hz, which deviates from the expected signal value by 0.29% (0.145 / 50). Compared with the previously calculated rounding error of 0.25%, this leaves a residual difference of 0.04% from the expected value.
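The comparison above can be checked numerically. This is a minimal sketch of the arithmetic in the text; the 0.25% rounding error is the value calculated earlier for a 1/10000 second sample time.

```python
# Deviation of the measured sine wave from the expected 50 Hz signal.
expected_hz = 50.0     # sine wave generated by the output board
measured_hz = 50.145   # measured signal from the example

deviation = (measured_hz - expected_hz) / expected_hz
print(f"{deviation:.2%}")            # 0.29% deviation from expected

rounding_error = 0.0025              # 0.25% from the earlier tick calculation
print(f"{deviation - rounding_error:.2%}")  # 0.04% residual difference
```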

If you want to refine the measured deviation for your target computer, assume the following:

• Output board that generates a 50 Hz sine wave (expected signal)

• Sample time of 1/10200

• Measured signal of 50.002 Hz:

1/10200 s × 1.193 × 10⁶ ticks/s = 116.96 ticks

Round this number to the nearest whole number, 117 ticks. The resulting frequency is then:

(116.96 ticks / 117 ticks) × 50 Hz = 49.983 Hz

The difference between the expected and measured signals is 0.019 Hz, which deviates from the expected signal value by 0.038% (0.019 / 50.002). When the sample time is 1/10000, the deviation is 0.04%.
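The refined calculation can be reproduced the same way. Again, this is only a sketch of the arithmetic from the text, using the 1.193 MHz timer frequency and the measured value of 50.002 Hz given above.

```python
# Refined deviation estimate for a 1/10200 second sample time.
TIMER_HZ = 1.193e6                    # target computer timer clock (ticks/s)
requested_s = 1 / 10200

ticks_exact = requested_s * TIMER_HZ  # about 116.96 ticks
ticks = round(ticks_exact)            # rounds to 117 ticks
expected_hz = (ticks_exact / ticks) * 50  # about 49.983 Hz

measured_hz = 50.002                  # measured signal from the example
deviation = (measured_hz - expected_hz) / measured_hz
print(f"{deviation:.3%}")             # about 0.038%
```

Because 116.96 is closer to a whole tick count than 119.3, the 1/10200 sample time yields a smaller deviation than the 1/10000 sample time.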
