
Introduction to Statistical Radar Models for Object Tracking

Sensor Overview

In a tracking system, sensors generate measurements or detections from targets in an environment. A sensor generally has an aperture through which it intercepts the energy that targets either emit or reflect, and it primarily uses this intercepted energy to obtain information about the states and attributes of targets.

A sensor is an active sensor if the intercepted energy originates from itself, such as a monostatic radar or a monostatic sonar. A sensor is a passive sensor if the intercepted energy originates from an outside source, such as an infrared (IR) sensor, which receives radiated energy from a target.

In addition to receiving the targets’ energy, the aperture inevitably also collects interfering energy created either by nature (such as background clutter) or by man (such as a jamming signal). As a result, the detection quality of a sensor depends on many factors, such as accuracy, resolution, bias, and false alarms. When designing a tracking system, it is also essential to consider the detectability of sensors, which depends on factors such as scanning limits, field of view, and sensor mounting.

This introduction mainly discusses radar (including the fusionRadarSensor and radarEmitter objects), but some of the following descriptions also apply to other types of sensors (including the irSensor, sonarSensor, and sonarEmitter objects).

Radar Detection Mode

Radar uses radio wave signals reflected or emitted from a target to detect the target. Given different transmitter and receiver configurations, a radar can have one of three detection modes: monostatic, bistatic, or electronic support measures (ESM).

Figure: Three Radar Detection Modes. (a) Monostatic. (b) Bistatic. (c) ESM.

For the monostatic detection mode, the transmitter and the receiver are collocated, as shown in figure (a). In this mode, the range measurement R satisfies R = RT = RR, where RT and RR are the ranges from the transmitter to the target and from the target to the receiver, respectively. The radar obtains the range measurement as R = ct/2, where c is the speed of light and t is the total elapsed (round-trip) time of the signal transmission. In addition to the range measurement, a monostatic sensor can also optionally report range rate, azimuth, and elevation measurements of the target.

For the bistatic detection mode, the transmitter and the receiver are separated by a distance L. As shown in figure (b), the signal is emitted from the transmitter, reflected from the target, and eventually received by the receiver. The bistatic range measurement Rb is defined as Rb = RT + RR − L. The radar sensor obtains the bistatic range measurement as Rb = cΔt, where Δt is the time difference between the receiver intercepting the direct signal from the transmitter and intercepting the reflected signal from the target. In addition to the bistatic range measurement, a bistatic radar can optionally report the bistatic range rate, azimuth, and elevation measurements of the target. Since the bistatic range and the two bearing angles (azimuth and elevation) do not correspond to the same position vector, they cannot be combined into a position vector and reported in a Cartesian coordinate system. Without additional information, a bistatic sensor can only report measurements in a spherical coordinate system.

For the ESM detection mode, the receiver can only intercept a signal reflected from the target or emitted directly from the transmitter, as shown in figure (c). Therefore, the only available measurements are the azimuth and elevation of the target or transmitter. The ESM sensor reports these measurements in a spherical coordinate system.
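
The following MATLAB sketch is a quick numeric check of the two range equations above; the timing and geometry values are purely illustrative.

    % Illustrative range calculations for the monostatic and bistatic modes
    c = 299792458;            % speed of light (m/s)

    % Monostatic: R = c*t/2, where t is the round-trip delay
    t = 2e-6;                 % round-trip delay (s)
    R = c*t/2;                % reported range, about 300 m

    % Bistatic: Rb = c*dt = RT + RR - L, where dt is the delay between the
    % direct-path signal and the target-reflected signal at the receiver
    RT = 5e3;                 % transmitter-to-target range (m)
    RR = 4e3;                 % target-to-receiver range (m)
    L  = 3e3;                 % transmitter-to-receiver baseline (m)
    dt = (RT + RR - L)/c;     % time difference at the receiver (s)
    Rb = c*dt;                % bistatic range, equal to RT + RR - L = 6000 m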

Mounting Radar on Platform

To interpret the detections generated by the radar, you need to understand how the radar is mounted on the platform. The origin of the radar mounting frame (M) can be displaced from the origin of the platform frame (P). This displacement is usually specified by the MountingLocation property of a sensor object, such as fusionRadarSensor. The radar mounting frame can also be angularly displaced from the platform frame. You can specify this angular displacement, represented by three rotation angles in the z-y-x sequence, using the MountingAngles property. Initially, the radar scanning frame (S) is aligned with the mounting frame (M). When the radar starts scanning, however, it can scan around the z- and y-axes of the mounting frame, and the x-direction of the radar scanning frame is aligned with the current boresight direction of the radar.
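
For example, here is a minimal fusionRadarSensor configuration sketch that applies the mounting properties described above. The numeric values are illustrative, and the platform frame is assumed to follow the default NED convention, so a negative z value places the sensor above the platform origin.

    % Mount the radar 1 m forward of and 0.5 m above the platform origin,
    % with the boresight rotated 30 degrees in yaw (z-y-x sequence, degrees)
    radar = fusionRadarSensor(1, ...
        'MountingLocation', [1 0 -0.5], ...   % [x y z] offset in the platform frame (m)
        'MountingAngles',   [30 0 0]);        % [z y x] rotation angles (deg)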

Figure: Sensor Mounting Frame.

Detection Ability and Quality

Sensor Coverage

In most cases, a radar operates in a scanning mode in which the sensor beam sweeps back and forth, with a width equal to the field of view (FOV), through a space region defined by the radar scanning limits. The FOV typically corresponds to the 3-dB (half-power) beamwidth of the radar beam. The sweep rate is determined by the UpdateRate property of the sensor object. You can obtain the scanning speed of the sensor by multiplying its field of view by its update rate. For example, if the update rate is 20 Hz and the field of view is 2 degrees, then the radar scanning speed is 40 degrees per second. For more details on radar sensor coverage, see the Scanning Radar Mode Configuration example.
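
For instance, this configuration sketch sets the field of view and update rate from the example above and computes the resulting azimuth scanning speed. The property names follow fusionRadarSensor, and the values are illustrative only.

    % A 2-degree azimuth beam updated at 20 Hz sweeps 40 degrees per second
    radar = fusionRadarSensor(1, ...
        'UpdateRate',  20, ...       % beam updates per second (Hz)
        'FieldOfView', [2; 10]);     % [azimuth; elevation] field of view (deg)

    scanSpeed = radar.FieldOfView(1)*radar.UpdateRate;   % 2 deg x 20 Hz = 40 deg/s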

Figure: Sensor Coverage.

Resolution and Accuracy

Sensor resolution defines the ability of a sensor to distinguish between two targets. In a 3-D space, the resolution bin of a radar is formed by the azimuth boundary, elevation boundary, and range boundary. If two targets fall within the same resolution bin, then the radar cannot distinguish between them and reports them as one target in the detection.
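
As a rough illustration of this idea (not the exact binning logic used by the sensor models), the following sketch checks whether two targets are separated by less than one resolution cell in every measurement dimension:

    % Illustrative resolution-cell check; the cell sizes are arbitrary
    azRes = 1;  elRes = 5;  rngRes = 50;         % azimuth (deg), elevation (deg), range (m)
    tgt1 = [10.2 0.5 1000];                      % [az el range] of target 1
    tgt2 = [10.9 1.0 1030];                      % [az el range] of target 2
    unresolved = all(abs(tgt1 - tgt2) < [azRes elRes rngRes])   % true: reported as one detection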

Sensor accuracy can be described by the standard deviation of the measurement error. The accuracy is mainly affected by two factors: the signal-to-noise ratio (SNR) of the sensor and the detection bias of the sensor. SNR is defined as the ratio of the reflected signal power to the noise power, expressed in decibels (dB). A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise. A larger SNR results in a smaller measurement error and higher accuracy. For radar, SNR is usually a function of the radar cross section (RCS) of the target. The bias of a sensor is mainly due to imperfect alignment or calibration, and is often assumed to be a constant value. In each radar object, you can specify the bias as a fraction of the sensor resolution bin size using properties such as AzimuthBiasFraction. The larger the bias, the greater the error in the reported detection.
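
A configuration sketch of these accuracy-related properties follows. The property names are the fusionRadarSensor bias-fraction property mentioned above and its companion resolution properties, and the numeric values are illustrative only.

    % Bias fractions are expressed as a fraction of the resolution bin size
    radar = fusionRadarSensor(1, ...
        'AzimuthResolution',   1, ...     % azimuth bin size (deg)
        'RangeResolution',     50, ...    % range bin size (m)
        'AzimuthBiasFraction', 0.1, ...   % azimuth bias = 0.1 x 1 deg = 0.1 deg
        'RangeBiasFraction',   0.05);     % range bias = 0.05 x 50 m = 2.5 m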

Detection Statistics

Radar can also make an incorrect assessment of the surveillance region. The probability of false alarm (PFA) represents the probability that the radar reports a detection in a resolution bin even though the bin is not occupied by a target. The probability of detection (PD) represents the probability that the radar reports a detection in a resolution bin when the bin is actually occupied by a target. Therefore, 1 – PD represents the probability that a target present in the bin is not detected by the radar. PD is mainly a function of the SNR of the target and the PFA of the radar.
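
For example, here is a hedged sketch of the statistical properties involved. The property names follow fusionRadarSensor; the values are illustrative, and the reference-target interpretation of DetectionProbability is an assumption of this sketch.

    % PD for a reference target, and the per-bin false alarm probability
    radar = fusionRadarSensor(1, ...
        'DetectionProbability', 0.9, ...     % PD for the reference target
        'FalseAlarmRate',       1e-6, ...    % probability of false alarm per bin
        'ReferenceRange',       100e3, ...   % range of the reference target (m)
        'ReferenceRCS',         0);          % RCS of the reference target (dBsm)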

When a radar operates in an environment where other undesirable radio frequency (RF) emissions interfere with the waveforms emitted by the radar, the radar can experience a degradation in detection performance in the direction of the interfering signals.

Reported Target Range and Range-Rate

In many cases, a radar has maximum unambiguous range and range-rate limits. If the distance between a target and the sensor is greater than the maximum unambiguous range, then the sensor wraps the detected range into the interval [0, Rmax], where Rmax is the maximum unambiguous range. For example, if the target’s range Rt is greater than Rmax, then the reported range of the target is mod(Rt, Rmax), where mod is the remainder-after-division function in MATLAB. In a radar object, you can disable this limitation by setting the HasMaxUnambiguousRange property to false.
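
A short numeric example of this wrapping behavior, with illustrative values:

    % Range wrapping beyond the maximum unambiguous range
    Rmax = 100e3;                  % maximum unambiguous range (m)
    Rt   = 130e3;                  % true target range (m)
    Rreported = mod(Rt, Rmax);     % reported range: 30e3 m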

Measurement and Detection Format

For tracking systems, there are two basic classes of measurements: kinematic and attribute. Kinematic measurements provide the tracking system with information about the existence and location of the target. Typical kinematic measurements include range, range rate, azimuth, and elevation. Attribute measurements usually contain identification information and characteristics of the target, such as shape and reflectivity. The kinematic measurements for radar are described here.

In general, radar can report kinematic measurements in either spherical or Cartesian coordinate frames. For spherical coordinates, the radar can report the azimuth, elevation, range, and range rate measurements. For Cartesian coordinates, the radar can report the 2-D or 3-D position and velocity measurements based on the setup. Each radar detection mode can only output certain types of measurements. The available detection coordinates for each detection mode are:

  • For the monostatic detection mode, detections can be reported in the spherical or Cartesian coordinate frames (see the configuration sketch after this list).

  • For the bistatic detection mode, detections can only be reported in the spherical coordinate frame, and the reported range is the bistatic range to the target.

  • For the ESM detection mode, detections can only be reported in the spherical coordinate frame.
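
As a hedged example for the monostatic case, the DetectionCoordinates property of fusionRadarSensor selects the reported frame; the property values shown here are assumptions based on that object, and the settings are illustrative only.

    % Report monostatic detections in sensor spherical coordinates
    radarSph  = fusionRadarSensor(1, 'DetectionCoordinates', 'Sensor spherical');

    % Report monostatic detections as Cartesian positions in the scenario frame;
    % HasINS provides the platform pose needed to resolve the scenario frame
    radarCart = fusionRadarSensor(1, ...
        'DetectionCoordinates', 'Scenario', ...
        'HasINS', true);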

In the Sensor Fusion and Tracking Toolbox, sensor objects output detections in the form of objectDetection objects. An objectDetection object contains these properties:

  • Time: Detection time

  • Measurement: Object measurements

  • MeasurementNoise: Measurement noise covariance matrix

  • SensorIndex: Unique ID of the sensor

  • ObjectClassID: Unique ID for object classification

  • MeasurementParameters: Parameters used to interpret the measurement, such as sensor configuration and detection frame information

  • ObjectAttributes: Additional information about the target, such as target ID and target RCS

Note that the MeasurementParameters property contains essential information used to interpret the measurements, such as sensor pose (position, velocity, and orientation) and coordinate frame information for measurements at the time of detection. For more details, see Measurement Parameters and the Convert Detections to objectDetection Format example.
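
For reference, here is a minimal sketch of constructing an objectDetection directly. The measurement is assumed to be an [azimuth; elevation; range] vector in sensor spherical coordinates, and all numeric values are illustrative.

    % A detection at t = 0.5 s with [azimuth (deg); elevation (deg); range (m)]
    det = objectDetection(0.5, [30; 5; 1200], ...
        'SensorIndex',      1, ...                   % unique ID of the reporting sensor
        'MeasurementNoise', diag([2 2 10].^2));      % covariance of the measurement error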