Sensor Fusion and Tracking Toolbox™ enables you to fuse data read from an inertial measurement unit (IMU) to estimate orientation and angular velocity:

- `ecompass` — Fuse accelerometer and magnetometer readings
- `imufilter` — Fuse accelerometer and gyroscope readings
- `ahrsfilter` — Fuse accelerometer, gyroscope, and magnetometer readings

More sensors on an IMU result in a more robust orientation estimate: the sensor data can be cross-validated, and each sensor conveys complementary information.

This tutorial provides an overview of inertial sensor fusion for IMUs in Sensor Fusion and Tracking Toolbox.

To learn how to model inertial sensors and GPS, see Model IMU, GPS, and INS/GPS. To learn how to generate the ground-truth motion that drives sensor models, see `waypointTrajectory` and `kinematicTrajectory`.

You can also fuse IMU readings with GPS readings to estimate pose. See Determine Pose Using Inertial Sensors and GPS for an overview.

This example shows how to use 6-axis and 9-axis fusion algorithms to compute orientation. Sensor Fusion and Tracking Toolbox™ includes several algorithms to compute orientation from inertial measurement units (IMUs) and magnetic-angular rate-gravity (MARG) units. This example covers the basics of orientation and how to use these algorithms.

**Orientation**

An object's orientation describes its rotation relative to some coordinate system, sometimes called a parent coordinate system, in three dimensions.

Sensor Fusion and Tracking Toolbox uses North-East-Down (NED) as a fixed, parent coordinate system. NED is sometimes referred to as the global coordinate system or reference frame. In the NED reference frame, the X-axis points north, the Y-axis points east, and the Z-axis points downward. The X-Y plane of NED is considered to be the local tangent plane of the Earth. Depending on the algorithm, north may be either magnetic north or true north. The algorithms in this example use magnetic north.

An object can be thought of as having its own coordinate system, often called the local or child coordinate system. This child coordinate system rotates with the object relative to the parent coordinate system. If there is no translation, the origins of both coordinate systems overlap.

The orientation quantity computed in Sensor Fusion and Tracking Toolbox is a rotation that takes quantities from the parent reference frame to the child reference frame. The rotation is represented by a quaternion or rotation matrix.
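To make the parent-to-child convention concrete, here is a small NumPy sketch (Python rather than MATLAB, purely for illustration): a unit quaternion is converted to a frame rotation matrix, which maps a vector's parent (NED) coordinates to its child (sensor) coordinates. The function name and the `[w, x, y, z]` ordering are choices made for this sketch.

```python
import numpy as np

def quat_to_rotmat(q):
    """Frame rotation matrix for unit quaternion q = [w, x, y, z]:
    multiplying a parent-frame (NED) vector yields its child-frame coordinates."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y + w*z),     2*(x*z - w*y)],
        [2*(x*y - w*z),     1 - 2*(x*x + z*z), 2*(y*z + w*x)],
        [2*(x*z + w*y),     2*(y*z - w*x),     1 - 2*(x*x + y*y)],
    ])

# A 90-degree yaw (rotation about the NED down axis): the child X-axis points
# east, so the parent's north direction appears as -Y in child coordinates.
q_yaw90 = np.array([np.sqrt(0.5), 0.0, 0.0, np.sqrt(0.5)])
north_in_child = quat_to_rotmat(q_yaw90) @ np.array([1.0, 0.0, 0.0])
```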

**Types of Sensors**

For orientation estimation, three types of sensors are commonly used: accelerometers, gyroscopes and magnetometers. Accelerometers measure proper acceleration. Gyroscopes measure angular velocity. Magnetometers measure the local magnetic field. Different algorithms are used to fuse different combinations of sensors to estimate orientation.
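The three measurements can be written down as simple idealized models. The NumPy sketch below is noise-free and stationary, with illustrative placeholder values for gravity and the geomagnetic field; it is not how the toolbox models sensors (see `imuSensor` for that), only a way to see what each sensor reports.

```python
import numpy as np

def ideal_imu_readings(R_frame, gyro_bias,
                       g_ned=np.array([0.0, 0.0, 9.81]),     # gravity, down positive
                       mag_ned=np.array([19.5, 0.0, 45.0])): # made-up field, in uT
    """Idealized, noise-free readings of a *stationary* IMU whose orientation is
    the frame rotation R_frame (parent NED -> sensor frame)."""
    accel = R_frame @ (-g_ned)        # accelerometer measures specific force (reaction to gravity)
    gyro = np.zeros(3) + gyro_bias    # stationary: only the slowly drifting bias remains
    mag = R_frame @ mag_ned           # local magnetic field expressed in sensor coordinates
    return accel, gyro, mag
```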

**Sensor Data**

Through most of this example, the same set of sensor data is used. Accelerometer, gyroscope, and magnetometer sensor data was recorded while a device rotated around three different axes: first around its local Y-axis, then around its Z-axis, and finally around its X-axis. The device's X-axis was generally pointed southward for the duration of the experiment.

```
ld = load('rpy_9axis.mat');
acc = ld.sensorData.Acceleration;
gyro = ld.sensorData.AngularVelocity;
mag = ld.sensorData.MagneticField;
viewer = HelperOrientationViewer;
```

**Accelerometer-Magnetometer Fusion**

The `ecompass` function fuses accelerometer and magnetometer data. This memoryless algorithm requires no parameter tuning, but it is highly susceptible to sensor noise.

```
qe = ecompass(acc, mag);
for ii = 1:size(acc,1)
    viewer(qe(ii));
    pause(0.01);
end
```

Note that the `ecompass` algorithm correctly finds the location of north. However, because the function is memoryless, the estimated motion is not smooth. The estimate is strongly affected by noise in the accelerometer and magnetometer. Some of the techniques presented in the Lowpass Filter Orientation Using Quaternion SLERP example could be used to smooth the motion.
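The core idea behind an accelerometer-magnetometer eCompass can be sketched as a TRIAD-style construction: gravity fixes the down direction, the magnetic field then fixes east and north. The NumPy version below illustrates that general idea only; it is not the toolbox's exact `ecompass` algorithm, and the function name is invented for this sketch.

```python
import numpy as np

def ecompass_like(acc, mag):
    """Memoryless accelerometer+magnetometer orientation estimate (TRIAD-style).
    Returns a frame rotation matrix R with v_sensor = R @ v_ned."""
    down = -np.asarray(acc, dtype=float)
    down /= np.linalg.norm(down)          # stationary accel reads -g, so -acc points down
    east = np.cross(down, mag)
    east /= np.linalg.norm(east)          # east is perpendicular to both down and the field
    north = np.cross(east, down)          # complete the right-handed NED triad
    return np.column_stack((north, east, down))
```

Because each output depends only on the current sample, noise in either sensor passes straight through to the estimate, which is why the `ecompass` motion above is jittery.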

**Accelerometer-Gyroscope Fusion**

The `imufilter` System object™ fuses accelerometer and gyroscope data using an internal error-state Kalman filter. The filter is capable of removing the gyroscope bias, which drifts over time.

```
ifilt = imufilter('SampleRate', ld.Fs);
for ii = 1:size(acc,1)
    qimu = ifilt(acc(ii,:), gyro(ii,:));
    viewer(qimu);
    pause(0.01);
end
```

Although the `imufilter` algorithm produces a significantly smoother estimate of the motion than `ecompass`, it does not correctly estimate the direction of north. The `imufilter` object does not process magnetometer data, so it simply assumes the device's X-axis initially points northward. The motion estimate given by `imufilter` is relative to the initial estimated orientation.
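The accelerometer-gyroscope fusion idea can be sketched with a minimal complementary filter: propagate the estimated "down" direction with the gyroscope, then nudge it toward the accelerometer's gravity measurement. `imufilter`'s error-state Kalman filter is far more sophisticated (it also estimates gyroscope bias); the sketch below, with its invented names and blending gain, only conveys why the fused estimate is smooth yet anchored to gravity.

```python
import numpy as np

def track_down(acc, gyro, fs, alpha=0.02):
    """Complementary-filter sketch: track the 'down' unit vector in sensor
    coordinates by integrating the gyro and correcting with the accelerometer."""
    dt = 1.0 / fs
    down = -np.asarray(acc[0], dtype=float)
    down /= np.linalg.norm(down)
    out = np.empty((len(acc), 3))
    for k, (a, w) in enumerate(zip(acc, gyro)):
        down = down - np.cross(w, down) * dt   # world-fixed vector seen from the rotating body
        meas = -np.asarray(a, dtype=float)
        meas /= np.linalg.norm(meas)           # accelerometer's instantaneous gravity estimate
        down = (1 - alpha) * down + alpha * meas
        down /= np.linalg.norm(down)
        out[k] = down
    return out
```

Note that nothing here observes heading: like `imufilter`, this construction can only be accurate relative to its initial yaw.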

**Accelerometer-Gyroscope-Magnetometer Fusion**

An attitude and heading reference system (AHRS) is a 9-axis system that uses an accelerometer, gyroscope, and magnetometer to compute orientation. The `ahrsfilter` System object combines the best of the previous algorithms to produce a smoothly changing estimate of the device orientation while correctly estimating the direction of north. This algorithm also uses an error-state Kalman filter. In addition to gyroscope bias removal, the `ahrsfilter` object has some ability to detect and reject mild magnetic jamming.

```
ifilt = ahrsfilter('SampleRate', ld.Fs);
for ii = 1:size(acc,1)
    qahrs = ifilt(acc(ii,:), gyro(ii,:), mag(ii,:));
    viewer(qahrs);
    pause(0.01);
end
```

**Tuning Filter Parameters**

Tuning the parameters of the `ahrsfilter` and `imufilter` objects to match specific hardware sensors can improve performance. The environment of the sensor is also important to take into account. The `imufilter` parameters are a subset of the `ahrsfilter` parameters. The `AccelerometerNoise`, `GyroscopeNoise`, `MagnetometerNoise`, and `GyroscopeDriftNoise` properties are measurement noises. The sensors' datasheets help determine those values.

The `LinearAccelerationNoise` and `LinearAccelerationDecayFactor` properties govern the filter's response to linear (translational) acceleration. Shaking a device is a simple example of adding linear acceleration.

Consider how an `imufilter` with a `LinearAccelerationNoise` of 9e-3 responds to a shaking trajectory, compared to one with a `LinearAccelerationNoise` of 9e-4.

```
ld = load('shakingDevice.mat');
accel = ld.sensorData.Acceleration;
gyro = ld.sensorData.AngularVelocity;
viewer = HelperOrientationViewer;

highVarFilt = imufilter('SampleRate', ld.Fs, ...
    'LinearAccelerationNoise', 0.009);
qHighLANoise = highVarFilt(accel, gyro);

lowVarFilt = imufilter('SampleRate', ld.Fs, ...
    'LinearAccelerationNoise', 0.0009);
qLowLANoise = lowVarFilt(accel, gyro);
```

One way to see the effect of `LinearAccelerationNoise` is to look at the output gravity vector. The gravity vector is simply the third column of the orientation rotation matrix.

```
rmatHigh = rotmat(qHighLANoise, 'frame');
rmatLow = rotmat(qLowLANoise, 'frame');
gravDistHigh = sqrt(sum((rmatHigh(:,3,:) - [0;0;1]).^2, 1));
gravDistLow = sqrt(sum((rmatLow(:,3,:) - [0;0;1]).^2, 1));
figure;
plot([squeeze(gravDistHigh), squeeze(gravDistLow)]);
title('Euclidean Distance to Gravity');
legend('LinearAccelerationNoise = 0.009', ...
    'LinearAccelerationNoise = 0.0009');
```

The `lowVarFilt` filter has a low `LinearAccelerationNoise`, so it expects an environment with little linear acceleration. It is therefore more susceptible to linear acceleration, as illustrated by the large variations early in the plot. However, because it expects little linear acceleration, it places higher trust in the accelerometer signal, so its orientation estimate converges back to vertical quickly once the shaking ends. The converse is true for `highVarFilt`: the filter is less affected by the shaking, but its orientation estimate takes longer to converge to vertical after the shaking stops.

The `MagneticDisturbanceNoise` property enables modeling magnetic disturbances (non-geomagnetic noise sources) in much the same way that `LinearAccelerationNoise` models linear acceleration.

The two decay factor properties (`MagneticDisturbanceDecayFactor` and `LinearAccelerationDecayFactor`) model the rate of variation of the noises. For slowly varying noise sources, set these parameters closer to 1. For quickly varying, uncorrelated noises, set them closer to 0. A lower `LinearAccelerationDecayFactor` enables the orientation estimate to find "down" more quickly. A lower `MagneticDisturbanceDecayFactor` enables the orientation estimate to find north more quickly.
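A decay factor near 1 versus near 0 can be pictured with a first-order autoregressive disturbance, `x[k] = decay*x[k-1] + w[k]`: the process wanders slowly when `decay` is near 1 and looks nearly white when `decay` is near 0. Reading the decay-factor properties this way is an assumption made for this sketch, not a statement about the toolbox's internals.

```python
import numpy as np

def first_order_noise(decay, n, sigma=1.0, seed=0):
    """First-order (Gauss-Markov-like) disturbance: x[k] = decay*x[k-1] + w[k].
    decay near 1 -> slowly varying; decay near 0 -> quickly varying, uncorrelated."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = decay * x[k-1] + sigma * rng.standard_normal()
    return x

slow = first_order_noise(0.99, 5000)   # persistent, drifting disturbance
fast = first_order_noise(0.10, 5000)   # almost-white disturbance
```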

Very large, short magnetic disturbances are rejected almost entirely by the `ahrsfilter` object. Consider a pulse of [0 250 0] µT applied while recording from a stationary sensor. Ideally, there should be no change in the orientation estimate.

```
ld = load('magJamming.mat');
hpulse = ahrsfilter('SampleRate', ld.Fs);
len = 1:10000;
qpulse = hpulse(ld.sensorData.Acceleration(len,:), ...
    ld.sensorData.AngularVelocity(len,:), ...
    ld.sensorData.MagneticField(len,:));
figure;
timevec = (0:numel(qpulse)-1)/ld.Fs;   % time in seconds at sample rate Fs
plot(timevec, eulerd(qpulse, 'ZYX', 'frame'));
title(['Stationary Trajectory Orientation Euler Angles' newline ...
    'Magnetic Jamming Response']);
legend('Z-rotation', 'Y-rotation', 'X-rotation');
ylabel('Degrees');
xlabel('Seconds');
```

Note that the filter almost totally rejects this magnetic pulse as interference. Any magnetic field strength greater than four times the `ExpectedMagneticFieldStrength` property is considered a jamming source, and the magnetometer signal is ignored for those samples.
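That 4× gating rule is simple enough to sketch directly. In the NumPy version below, the function name and the 50 µT default (standing in for `ExpectedMagneticFieldStrength`) are illustrative placeholders.

```python
import numpy as np

def magnetometer_rejected(mag_sample, expected_strength=50.0, factor=4.0):
    """Gate a magnetometer sample (uT) whose field magnitude exceeds
    factor times the expected geomagnetic strength."""
    return np.linalg.norm(mag_sample) > factor * expected_strength

# The [0 250 0] uT pulse trips the gate; a typical geomagnetic field does not.
pulse_rejected = magnetometer_rejected([0.0, 250.0, 0.0])
normal_rejected = magnetometer_rejected([19.5, 0.0, 45.0])
```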

**Conclusion**

The algorithms presented here, when properly tuned, enable estimation of orientation and are robust against environmental noise sources. It is important to consider the situations in which the sensors are used and tune the filters accordingly.

**See Also**

`ahrsfilter` | `ecompass` | `imuSensor` | `imufilter` | `insfilter`