
Event-Based Sensor Fusion and Tracking with Retrodiction

This example shows how to construct an event-based sensor fusion and tracking model in Simulink®.

Introduction

In this example, you create a model for sensor fusion and tracking by simulating a radar sensor and a vision camera, each running at a different update rate. The sensors and the tracker run on separate electronic control units (ECUs). The tracker runs asynchronously from the sensors and is stepped whenever a batch of detections arrives from either sensor.

Model Description

Open the Simulink model using the open_system command.

open_system('EventBasedTrackingModel.slx');

The model consists of five parts.

The Scenario part of the model consists of a Scenario Reader block, which loads the scenario saved in EventBasedFusion.mat. The scenario contains four vehicles: the ego vehicle, a car in front of it, a passing car, and a car behind the ego vehicle.

The ego car has two sensors: a radar and a vision camera. The radar is simulated using the Driving Radar Data Generator block, running at 25 Hz, or every 40 milliseconds. The camera is simulated using a Vision Detection Generator block, running every 44 milliseconds. To stay aligned with both sensors, the scenario must advance in steps that divide both sensor intervals; the largest such step is 4 milliseconds, which corresponds to a 250 Hz update rate. Both sensor models are shown in the Sensor Simulation part of the model.
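
As a quick check of this rate, the following MATLAB snippet (not part of the model) computes the largest step that divides both sensor intervals:

% The scenario step must divide both sensor intervals, so the largest
% usable step is their greatest common divisor. Integer milliseconds keep gcd exact.
radarIntervalMs  = 40;                               % 25 Hz radar
cameraIntervalMs = 44;                               % vision camera
baseStepMs = gcd(radarIntervalMs,cameraIntervalMs)   % 4 ms
baseRateHz = 1000/baseStepMs                         % 250 Hz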

The Message Delivery System part of the model simulates the asynchronous communication between the sensors and the tracker. Each sensor outputs a bus of detections, which a Message Send block packs into a message and delivers to an Entity Queue block. Each queue is organized as a first-in, first-out (FIFO) queue. Whenever a queue contains messages, it triggers the tracker subsystem.

The Sensor Fusion and Tracking part of the model consists of a single Triggered Subsystem block. The subsystem is connected to the radar and vision queues and is triggered whenever either queue has messages.

open_system('EventBasedTrackingModel/SensorFusionAndTracking');

In the subsystem, the Message Receive blocks read the messages and pass their payload, the detections bus, to the Detection Concatenation block and then to the tracker. With this setup, the tracker is stepped only when there is new sensor data to process.
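
As a rough MATLAB analogy (not the Simulink implementation), this behavior amounts to stepping a tracker object only when a nonempty batch of detections is available. The detection and time values below are made up for illustration; trackerGNN and objectDetection come from the Sensor Fusion and Tracking Toolbox.

% Conceptual sketch only: step the tracker when a batch of detections arrives.
tracker = trackerGNN;                        % placeholder tracker configuration
batch = {objectDetection(0.04,[10;3;0])};    % e.g., the payload of one radar message
if ~isempty(batch)
    tracks = tracker(batch,0.04);            % the tracker runs only on new data
end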

The final part of the model is the Visualization, in which a helper block visualizes the scenario, sensor, and tracking data.

Configure the Tracker to Use Retrodiction

For the tracker, you use a Global Nearest Neighbor Tracker block. You modify the tracker to use the retrodiction technique for out-of-sequence measurement (OOSM) handling. When you use the tracker asynchronously, the tracker clock may run ahead of the arriving messages, which makes those messages out of sequence. Setting the tracker OOSM handling to retrodiction lets the tracker process these messages instead of terminating with an error or neglecting them. Overall, retrodiction improves the tracker accuracy.

Using retrodiction requires more memory because the tracker must maintain a history of each track. To reduce the memory allocation, you reduce the maximum number of tracks to 20, because there are just a few objects in the scenario. Similarly, you reduce the maximum number of sensors to 2, because only two sensors report to the tracker.

You increase the threshold for assigning detections to tracks from the default 30 to 50 to allow vision and radar detections that may have different offsets on the object to be assigned to the same track. A larger assignment threshold may result in false detections getting assigned to each other and creating false tracks. To reduce the rate of false tracks, you make the confirmation threshold stricter by increasing it from the default 2-out-of-3 to 4-out-of-5 detections.
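
Expressed on the trackerGNN System object that underlies the block (the block dialog exposes matching parameters), this configuration looks roughly like the following sketch:

% Sketch of the tracker configuration described above, using the
% trackerGNN System object equivalent of the Simulink block.
tracker = trackerGNN( ...
    'OOSMHandling',         'Retrodiction', ... % process out-of-sequence detections
    'MaxNumTracks',          20, ...            % only a few objects in the scenario
    'MaxNumSensors',         2, ...             % radar and vision only
    'AssignmentThreshold',   50, ...            % tolerate radar/vision offsets on the same object
    'ConfirmationThreshold', [4 5]);            % confirm a track after 4-out-of-5 detections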

Run the Model and See the Results

You run the model and then close it using the commands below.

sim('EventBasedTrackingModel.slx');
close_system('EventBasedTrackingModel.slx');

The simulation shows that the tracker tracks the vehicle in front of the ego vehicle after a few steps required for confirmation and maintains the track throughout the scenario. The passing vehicle, in yellow, is tracked only after it enters the radar field of view. The passing vehicle track is dropped after it becomes obscured by the front vehicle. The vehicle behind the ego vehicle is never detected by any sensor and therefore it is never tracked.

Summary

This example showed you how to use an event-based sensor fusion and tracking system. The example showed how to connect sensors with different update rates using an asynchronous tracker and how to trigger the tracker to process sensor data whenever the data becomes available. The tracker uses the retrodiction out-of-sequence measurement handling technique to process sensor data that arrives out of sequence.
