Radar Architecture: Test Automation and Requirements Traceability (Part 2)

This example is the second part of a two-part series on how to design and test a radar system in Simulink® based on a set of performance requirements. It covers testing of the model developed in Part 1 and verification of the initial requirements. It shows how to use Simulink Test™ to set up test suites that verify the requirements linked to the components of the system. The example also explores a scenario in which the stated requirements are revised, leading to changes in the design and the tests.

Part 1 of this example starts with a set of performance requirements. It develops an architecture model of a radar system using System Composer™. This architecture model serves as a virtual test bed for testing and verifying the radar system design. Part 1 shows how to use Requirements Toolbox™ to link the requirements to the components of the architecture. It also shows how to implement the individual components of the architecture in Simulink.

Automated Testing

Prior to setting up the tests, load the model constructed in Part 1 of this example.

open_system('slexRadarArchitectureExample')

Simulink Test Manager is a tool for creating test suites for the model. To access the Test Manager, click Simulink Test on the Apps tab, then navigate to the Tests tab and click Simulink Test Manager. To get started with the tests, create a new test file for the model by clicking New Test File. Then add two separate test suites, one for each requirement. Further configure the test suites by:

  • Adding a description to each test suite that briefly describes the functionality being tested.

  • Linking the test suite to one or multiple requirements. The tests in the test suite must pass in order for the requirements to be verified.

  • Adding callbacks for setup before and cleanup after the test run. This example requires a global variable in the base workspace to aggregate the results of multiple Monte Carlo runs within a single test suite. A sketch of such callbacks follows this list.
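For this example, the suite callbacks can be as simple as the following sketch. The detectionResults name matches the base workspace variable used by the custom criteria function described below; creating the variable with evalin is one possible approach.

% SETUP callback (sketch): create the aggregation variable in the base workspace
evalin('base', 'detectionResults = [];');

% CLEANUP callback (sketch): remove the variable after the test suite completes
evalin('base', 'clear detectionResults');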

Next, configure the tests within the test suites. Changes are needed only in the System Under Test, Parameter Overrides, Iterations, and Custom Criteria sections.

  • In the System Under Test section, set the Model field to the name of the model, which in this example is slexRadarArchitectureExample.

  • The Parameter Overrides section is used to assign different values to parameters in the base workspace during a test execution. Use this section to specify the target parameters for the maximum range test and the range resolution test.

For the maximum range test, specify a single target with 1 m2 radar cross section (RCS) at the range of 6000 m from the radar as stated in R1.

For the range resolution test, specify two targets with different RCS separated in range by 70 m as required by R2.

  • Because of the random noise and the target fluctuation effects, only the averaged radar system performance collected over multiple test runs can be verified. Use the Iterations section of the test to configure the test to run multiple times and implement a Monte Carlo simulation. This example adds a custom script to the Scripted Iterations subsection to set up the Monte Carlo runs. The script performs only ten iterations; robustly verifying the performance of the system requires more. A sketch of such a script follows this list.

  • The Custom Criteria section allows you to specify a custom rule that verifies the test results at the end of each iteration. Configure it to run the helperslexRadarArchitectureTestCriteria helper function, which processes the results of each test iteration and stores them in the detectionResults variable in the base workspace. This function computes the number of detection threshold crossings. If this number equals the number of targets in the test, the system passes the test iteration; otherwise the iteration is declared failed. In the last iteration, helperslexRadarArchitectureTestCriteria computes the total number of passed iterations. The second argument to this helper function is the percentage of iterations that must pass for the entire test to pass. The maximum range test requires that at least 90% of all iterations pass. Since the range resolution test models two independent targets, it requires that at least 80% of all test iterations are successful.
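The scripted iteration code might look like the following sketch. It relies on the sltest_testCase variable and the sltestiteration and addTestIteration functions that are available in the Scripted Iterations workspace of the Test Manager.

% Set up ten Monte Carlo iterations; the runs differ only through the
% random noise and the target fluctuation effects
for k = 1:10
    % Create a test iteration object
    testItr = sltestiteration;

    % Add the iteration to this test case
    % (an optional iteration name can be passed as a third argument)
    addTestIteration(sltest_testCase, testItr);
end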

Open this test suite in Test Manager.

open('slexRadarArchitectureTests.mldatx')

After adding the tests and linking them to the requirements, the status of the requirements in the Requirements Editor indicates that the verification has been added but the tests have not yet been executed.

Now the tests can be launched. After running both test suites, inspect the results of each individual iteration using the Data Inspector. The custom criteria helper function also prints the status of each iteration to the Command Window.
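The test suites can also be run programmatically from the MATLAB command line. A minimal sketch, assuming the test file is on the path, is:

% Load the test file into the Test Manager and run all of its tests
sltest.testmanager.load('slexRadarArchitectureTests.mldatx');
resultSet = sltest.testmanager.run;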

Since both tests passed, Requirements Editor now shows that both requirements have been implemented and verified.

Revised Requirements

It is common that during a design process the initial requirements are revised and changed. This example assumes that the new maximum range requirement is 8000 m and the new range resolution requirement is 35 m. The updated requirements are:

  • R1: The radar must detect a Swerling 1 Case target with a radar cross section (RCS) of 1 m2 at the range of 8000 m with a probability of detection of 0.9 and the probability of false alarm of 1e-6.

  • R2: When returns are detected from two Swerling 1 Case targets separated in range by 35 m, with the same azimuth, the radar must resolve the two targets and generate two unique target reports 80 percent of the time.

Making changes to requirements in the Requirements Editor generates change issues and highlights the Summary status of the corresponding requirement in red. The links to the components that implement the changed requirement and to the tests that verify it are also highlighted. This makes it easy to identify which components of the design and which tests must be updated to address the changed requirement and to test it.

To monitor the changes in the requirements or in the implementations of the system components, use the Requirements Traceability Matrix.

Updated System Parameters

The new maximum range requirement is beyond the current unambiguous range of the system, which is 7494.8 m. To satisfy the new requirement, increase the unambiguous range by lowering the PRF. Setting the PRF to 16 kHz results in an unambiguous range of 9368.5 m, well beyond the required maximum range of 8000 m.
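As a quick check of this choice, compute the unambiguous range directly from the PRF. The physconst function used below requires Phased Array System Toolbox.

% Unambiguous range for the updated PRF, Rua = c/(2*PRF)
c = physconst('LightSpeed');    % speed of light (m/s)
prf = 16e3;                     % updated PRF (Hz)
Rua = c/(2*prf)                 % approximately 9368.5 m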

Since the current radar design transmits unmodulated rectangular pulses, the range resolution limit of the system is determined by the pulse width. The current range resolution limit is 60 m. The new requirement of 35 m is almost two times tighter. A rectangular pulse that satisfies this requirement would have to be half as long, halving the pulse energy available at the same range. The requirement analysis using the Radar Designer app shows that such a system cannot reach the required detection performance at the maximum range of 8000 m. To achieve the required maximum range and range resolution without increasing the peak transmitted power or the antenna gain, adopt a new waveform with a time-bandwidth product larger than 1. Setting the pulse width to 1 μs and the bandwidth to 5 MHz provides the desired resolution.
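These resolution numbers follow from the standard formulas: c*tau/2 for a rectangular pulse and c/(2*B) for an LFM waveform. In the sketch below, the 0.4 μs pulse width is inferred from the stated 60 m resolution limit.

% Range resolution of a rectangular pulse is set by the pulse width
c = physconst('LightSpeed');
tau = 0.4e-6;                     % pulse width implied by the 60 m limit (s)
rectangularResolution = c*tau/2   % approximately 60 m

% Range resolution of an LFM waveform is set by the sweep bandwidth
bw = 5e6;                         % sweep bandwidth (Hz)
lfmResolution = c/(2*bw)          % approximately 30 m, within the 35 m requirement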

Open this design in Radar Designer app.

radarDesigner('RadarDesigner_LFMWaveform.mat')

The Pulse Waveform Analyzer app can be used to select a radar waveform from several alternatives. This example uses the LFM waveform.

pulseWaveformAnalyzer('PulseWaveformAnalyzer_LFMWaveform.mat')

Revised Design

A convenient way to modify the behavior of a component of the system is to add an alternative design by creating a variant. To do this, right-click the component and select Add Variant Choice. Add a variant to Waveform Generator and add Simulink behavior to it to implement the LFM waveform generation.
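This step can also be scripted with the System Composer API. The sketch below assumes the makeVariant and addChoice methods are available and uses an illustrative name for the new variant choice.

% Load the architecture model and find the Waveform Generator component
arch = systemcomposer.loadModel('slexRadarArchitectureExample');
wg = lookup(arch, 'Path', 'slexRadarArchitectureExample/Waveform Generator');

% Convert the component into a variant component and add a choice for the LFM design
variantComp = makeVariant(wg);
addChoice(variantComp, {'LFM Waveform Generator'});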

Configure the Linear FM block by setting the pulse width to the new value of 1 μs, the sweep bandwidth to 5 MHz, and the PRF property to the updated value of 16 kHz. Run the model with the LFM waveform.

% Set the model parameters
helperslexRadarArchitectureParameters;

% Update the model parameters to use the LFM waveform
helperslexRadarArchitectureParametersLFM;

% Simulate the model with the LFM waveform
simOut = sim('slexRadarArchitectureExample.slx');

% Extract the logged output of the signal processor
data = simOut.logsout{1}.Values.Data;

% Plot the signal processor output against the range gates
figure;
plot(range_gates, data(numel(range_gates)+1:end));
xlabel('Range (m)');
ylabel('Power (W)');
title('Signal Processor Output');
grid on;

Updated Tests

Before verifying that the radar system with the LFM waveform can satisfy the updated requirements, make corresponding modifications to the tests by updating the target positions:

  • Set the target range in the maximum range test to 8000 m

  • Change target ranges in the range resolution test so the targets are positioned 35 m from each other

After updating the tests, clear all change issues in the Requirements Editor. Click Show Links on the Requirements tab, then select the links and click the Clear All button in the Change Information section of the Details panel on the right. Launch the tests once the issues are cleared. The new design passes the updated tests and verifies that the system satisfies the updated requirements, confirming the predictions made by the Radar Designer app.

Summary

This example is the second part of a two-part series on how to design and test a radar system in Simulink based on a set of performance requirements. It shows how to use Simulink Test to test the model developed in Part 1, how to link the test to the requirements, and how to verify that the requirements are satisfied by running Monte Carlo simulations. The example also illustrates how to trace changes in the requirements to the corresponding components and how to create alternative designs by adding variants to the model. Part 1 of this example starts with the requirements that must be satisfied by the final design. It uses System Composer to develop an architecture model of a radar system that can serve as a virtual test bed. Part 1 also shows how to use Requirements Toolbox to link the requirements to the components and how to implement the individual components of the architecture using Simulink.