
Asynchronous Sensor Fusion and Tracking with Retrodiction

This example shows how to construct an asynchronous sensor fusion and tracking model in Simulink®.

Introduction

In this example, you create a model for sensor fusion and tracking by simulating a radar sensor and a vision camera, each running at a different update rate. The sensors and the tracker run on separate electronic control units (ECUs), and the tracker runs asynchronously from the sensors at its own update rate.

Model Description

Open the Simulink model using the open_system command.

open_system('AsynchronousTrackingModel.slx');

The model consists of five parts.

The Scenario part of the model consists of a Scenario Reader block, which loads the scenario saved in AsynchronousTrackingScenario.mat. The scenario contains four vehicles: the ego vehicle, a car in front of it, a passing car, and a car behind the ego vehicle.
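A scenario of this shape can also be constructed programmatically. The following is a minimal sketch; the road geometry, waypoints, and speeds are illustrative assumptions and do not reproduce the contents of AsynchronousTrackingScenario.mat, which was created separately for this example.

% Illustrative four-vehicle scenario; geometry and speeds are assumptions.
scenario = drivingScenario('SampleTime', 0.004);     % 4 ms sample time
road(scenario, [0 0; 200 0], 'Lanes', lanespec(2));  % straight two-lane road

egoVehicle = vehicle(scenario, 'ClassID', 1);
trajectory(egoVehicle, [10 -2 0; 190 -2 0], 25);     % ego vehicle

leadCar = vehicle(scenario, 'ClassID', 1);
trajectory(leadCar, [40 -2 0; 200 -2 0], 24);        % car in front of the ego vehicle

passingCar = vehicle(scenario, 'ClassID', 1);
trajectory(passingCar, [5 2 0; 200 2 0], 30);        % passing car

rearCar = vehicle(scenario, 'ClassID', 1);
trajectory(rearCar, [1 -2 0; 150 -2 0], 23);         % car behind the ego vehicle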

The ego vehicle has two sensors: a radar and a vision camera. The radar is simulated using a Driving Radar Data Generator block running at 25 Hz, or every 40 milliseconds. The camera is simulated using a Vision Detection Generator block running every 44 milliseconds. Because the greatest common divisor of the two sensor intervals is 4 milliseconds, the scenario must run at a rate of at least 250 Hz to hit both sensor update times exactly. Both sensor models are shown in the Sensor Simulation part of the model.
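You can verify the required scenario rate at the MATLAB command line; the scenario sample time must divide both sensor intervals.

% Greatest common divisor of the sensor intervals, in milliseconds
dtScenario = gcd(40, 44)         % 4 ms scenario sample time
rateScenario = 1000/dtScenario   % 250 Hz scenario rate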

The Message Delivery System part of the model simulates the asynchronous communication between the sensors and the tracker. Each sensor outputs a bus of detections that a Message Send block packs into a message and delivers to an Entity Queue block. Each queue is organized as a last-in-first-out (LIFO) queue with a capacity of 1, so it stores only the latest data from its sensor. The Entity Transport Delay blocks simulate communication delays in the network; each block delays an entity by the period of time received at its second input port. In this example, you use Random Number blocks to generate delays with mean values of 40 and 44 milliseconds for the radar and vision sensors, respectively.
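Because the Random Number blocks draw normally distributed values, the delay applied to each message varies around the configured mean. The following command-line sketch illustrates such delays; the 4 millisecond standard deviation is an assumption for illustration, not a value taken from the model.

% Illustrative per-message network delays; the spread is an assumed value.
meanRadarDelay  = 0.040;   % 40 ms mean radar delay
meanVisionDelay = 0.044;   % 44 ms mean vision delay
sigma = 0.004;             % assumed standard deviation
radarDelays  = max(0, meanRadarDelay  + sigma*randn(5,1))   % clip at zero
visionDelays = max(0, meanVisionDelay + sigma*randn(5,1))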

The Sensor Fusion and Tracking part of the model consists of two Message Receive blocks and a triggered subsystem. The Message Receive blocks read the messages and pass their payloads to the subsystem. The subsystem is triggered by an external signal at 20 Hz, or every 50 milliseconds, which means that the tracker updates every 50 milliseconds. The trigger signal is generated by a Pulse Generator block.
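If you want to experiment with a different tracker update rate, you can set the trigger period programmatically. The block path below is an assumption about the model hierarchy; adjust it to match the actual model.

% Assumed block path; a 0.05 s period corresponds to 20 Hz tracker updates.
set_param('AsynchronousTrackingModel/Pulse Generator', ...
    'Period', '0.05', ...
    'PulseWidth', '50');   % pulse width as a percentage of the period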

open_system('AsynchronousTrackingModel/SensorFusionAndTracking');

In the subsystem, the Detection Concatenation block concatenates detections from both sensors and passes them to the tracker.

The final part of the model is the Visualization, where a helper block visualizes the scenario, sensor, and tracking data.

Configure the Tracker to Use Retrodiction

For the tracker, you use a Global Nearest Neighbor Tracker block. You configure the tracker to use the retrodiction technique for out-of-sequence measurement (OOSM) handling. When the tracker runs asynchronously, its clock may run ahead of the messages that arrive, which renders those messages out of sequence. Setting the tracker OOSM handling to retrodiction allows it to process these messages instead of terminating with an error or neglecting them. Overall, retrodiction improves the tracking accuracy.

Using retrodiction requires more memory because the tracker must maintain a history of each track. To reduce the memory allocation, you reduce the maximum number of tracks to 20, because there are just a few objects in the scenario. Similarly, you reduce the maximum number of sensors to 2, because only two sensors report to the tracker.

You increase the threshold for assigning detections to tracks from the default of 30 to 50, which allows vision and radar detections that may have different offsets on the same object to be assigned to the same track. However, a larger assignment threshold may also allow false detections to be assigned to each other, creating false tracks. To reduce the rate of false tracks, you make the confirmation threshold stricter by increasing it from the default 2-out-of-3 detections to 4-out-of-5 detections.
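These block parameters correspond to properties of the trackerGNN System object, which implements the same algorithm at the command line. The following sketch shows the equivalent configuration; properties not listed keep their default values.

% Command-line equivalent of the tracker block settings used in this example
tracker = trackerGNN( ...
    'OOSMHandling',          'Retrodiction', ... % process out-of-sequence detections
    'MaxNumTracks',          20, ...             % only a few objects in the scenario
    'MaxNumSensors',         2, ...              % radar and vision only
    'AssignmentThreshold',   50, ...             % increased from the default of 30
    'ConfirmationThreshold', [4 5]);             % confirm after 4-out-of-5 detections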

Run the Model and See the Results

Simulate the model using the sim command, and then close the model.

sim('AsynchronousTrackingModel.slx');
close_system('AsynchronousTrackingModel.slx');

The simulation shows that the tracker tracks the vehicle in front of the ego vehicle after the few steps required for track confirmation and maintains the track throughout the scenario. The passing vehicle, shown in yellow, is tracked only after it enters the field of view of the sensors. The vehicle behind the ego vehicle is never detected by either sensor and therefore is never tracked.

Summary

This example showed you how to construct an asynchronous sensor fusion and tracking model in Simulink. You learned how to connect sensors with different update rates to an asynchronous tracker and how to trigger the tracker to process sensor data at a rate different from the sensor rates. The tracker uses the retrodiction out-of-sequence measurement handling technique to process sensor data that arrives out of sequence.
