
Automate Testing for Highway Lane Following Controller

This example shows how to automate testing of a lane following controller and the generated code for this component by using Simulink® Test™. In this example, you:

  • Assess the behavior of a lane following controller on different test scenarios with different test requirements.

  • Automate testing of the lane following controller and the generated code for the reference model.

This example uses the lane following controller presented in the Generate Code for Highway Lane Following Controller example.

Introduction

The lane following controller is a fundamental component in highway lane following applications. The lane following controller generates the steering angle and acceleration control commands for an ego vehicle by using lane and vehicle information along with a set speed. For more information about how to design a lane following controller and configure the model for C++ code generation, see the Generate Code for Highway Lane Following Controller example.

This example shows how to automate testing of the lane following controller against multiple scenarios by using Simulink Test. The scenarios are based on system-level requirements. It also shows how you can verify the generated code using software-in-the-loop (SIL) simulation. In this example, you:

  1. Review requirements — The requirements describe system-level test conditions. Use simulation test scenarios to represent these conditions.

  2. Review test bench model — The model contains controls, vehicle dynamics, and metrics to assess functionality. The metric assessments integrate the test bench model with Simulink Test for automated testing.

  3. Disable runtime visualizations — Disable runtime visualizations to reduce the execution time for automated testing.

  4. Automate testing — Configure a test manager to simulate each test scenario, assess success criteria, and report results. Explore the results dynamically in the test manager and export them to a PDF for external reviewers.

  5. Automate testing with generated code — Configure the decision logic and controls components to generate C++ code. Run automated testing on the generated code to verify the behavior.

  6. Automate testing in parallel — Reduce the overall execution time for running the tests by using parallel computing on a multicore computer.

In this example, you enable system-level simulation through integration with the Unreal Engine® from Epic Games®.

if ~ispc
    error(['This example is supported only on Microsoft',char(174),' Windows',char(174),'.'])
end

Review Requirements

To explore the test bench model, load the highway lane following controller project.

openProject("HLFController");

Requirements Toolbox™ enables you to author, analyze, and manage requirements within Simulink. This example contains 12 test scenarios, with high-level testing requirements defined for each scenario. Open the requirement set.

open("HighwayLaneFollowingControllerTestRequirements.slreqx")

Alternatively, you can open the file from the Requirements tab of the Requirements Manager app in Simulink.

Each row in this file specifies the testing requirements of the lane following controller component in textual and graphical formats. The scenarios with the scenario_LF_ prefix enable you to test the lane following controller algorithm without obstruction by other vehicles. The scenarios with the scenario_ACC_ prefix enable you to test adaptive cruise control (ACC) behavior with other vehicles on the road. The scenarios with the scenario_LFACC_ prefix enable you to test lane following and ACC behavior with other vehicles on the road.

  • scenario_LF_01_Straight_RightLane — Straight road scenario with the ego vehicle in the right lane.

  • scenario_LF_02_Straight_LeftLane — Straight road scenario with the ego vehicle in the left lane.

  • scenario_LF_03_Curve_LeftLane — Curved road scenario with the ego vehicle in the left lane.

  • scenario_LF_04_Curve_RightLane — Curved road scenario with the ego vehicle in the right lane.

  • scenario_ACC_01_Straight_TargetDiscriminationTest — Straight road scenario with two target vehicles, one in the ego lane and the other in an adjacent lane. This scenario tests the ability of the ego vehicle to identify the lead vehicle when another target vehicle travels alongside it at the same speed.

  • scenario_ACC_02_Straight_StopnGo — Straight road scenario with a decelerating lead vehicle in the ego lane.

  • scenario_LFACC_01_Curve_DecelTarget — Curved road scenario with a decelerating lead vehicle in the ego lane.

  • scenario_LFACC_02_Curve_AutoRetarget — Curved road scenario with changing lead vehicles in the ego lane. This scenario tests the ability of the ego vehicle to retarget to a new lead vehicle while driving along a curve.

  • scenario_LFACC_03_Curve_StopnGo — Curved road scenario with a lead vehicle slowing down in the ego lane.

  • scenario_LFACC_04_Curve_CutInOut — Curved road scenario with a fast-moving car in the adjacent lane that cuts into the ego lane, and then cuts out from the ego lane.

  • scenario_LFACC_05_Curve_CutInOut_TooClose — Curved road scenario with a fast-moving car in the adjacent lane that cuts into the ego lane and cuts out from the ego lane aggressively.

  • scenario_LFACC_06_Straight_StopandGoLeadCar — Straight road scenario with a broken-down vehicle in the ego lane.

Review Test Bench Model

Open the test bench model.

open_system("HighwayLaneFollowingControllerTestBench")

The test bench model contains these subsystems:

  • Simulation 3D Scenario — Specifies the road, vehicles, and vision detection generator used for simulation.

  • Lane Following Decision Logic — Specifies the lateral and longitudinal decision logic, and provides lane center information and most important object (MIO) related information to the controller.

  • Lane Following Controller — Specifies the path-following controller that generates control commands to steer the ego vehicle.

  • Vehicle Dynamics — Specifies the dynamic model for the ego vehicle.

  • Metrics Assessment — Assesses system-level behavior.

The Simulation 3D Scenario, Lane Following Decision Logic, Lane Following Controller, Vehicle Dynamics, and Metrics Assessment subsystems are based on the subsystems used in the Generate Code for Highway Lane Following Controller example.

This example focuses on automating the simulation runs of this test bench model across the different test scenarios by using Simulink Test. The Metrics Assessment subsystem integrates the system-level metric evaluations with Simulink Test by using Check Static Range (Simulink) blocks. Open the Metrics Assessment subsystem.

open_system("HighwayLaneFollowingControllerTestBench/Metrics Assessment")

This example uses four metrics to assess the lane following system:

  • Verify Lateral Deviation — This block verifies that the lateral deviation from the center line of the lane is within the prescribed thresholds for the corresponding scenario. Define the thresholds when you author the test scenario.

  • Verify In Lane — This block verifies that the ego vehicle is following one of the lanes on the road throughout the simulation.

  • Verify Time gap — This block verifies that the time gap between the ego vehicle and the lead vehicle is more than 0.8 seconds. The time gap between the two vehicles is the ratio of the calculated headway distance to the ego vehicle velocity, as illustrated in the sketch after this list.

  • Verify No Collision — This block verifies that the ego vehicle does not collide with the lead vehicle at any point during the simulation.
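As a reference for the Verify Time gap check, this minimal sketch computes the time-gap metric for representative values. The variable names and numbers are illustrative, not signals from the model.

headwayDistance = 30;                   % headway distance to the lead vehicle (m)
egoVelocity = 25;                       % ego vehicle velocity (m/s)
timeGap = headwayDistance/egoVelocity;  % time gap = headway distance / velocity = 1.2 s
assert(timeGap > 0.8,"Time gap must be more than 0.8 seconds.")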

Disable Runtime Visualizations

The system-level test bench model opens an Unreal Engine simulation window for visualizing the scenario. This window is not required when the tests are automated.

Configure the Simulation 3D Scene Configuration block to run the Unreal Engine in headless mode, where the 3D simulation window is disabled.

blk = "HighwayLaneFollowingControllerTestBench/Simulation 3D Scenario/Simulation 3D Scene Configuration";
set_param(blk,EnableWindow="off");

Automate Testing

The Test Manager is configured to automate the testing of the lane following controller component. Open the HighwayLaneFollowingControllerMetricAssessments.mldatx test file in the Test Manager.

sltestmgr
sltest.testmanager.load("HighwayLaneFollowingControllerMetricAssessments.mldatx");

Observe the populated test cases previously authored in this file. These tests are configured to run the model.

Each test case uses the POST-LOAD callback to run the setup script with appropriate inputs. After the simulation of each test case, the Test Manager runs the script from the CLEANUP callback to generate the results plots.
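For reference, a POST-LOAD callback typically calls the scenario setup script with the scenario assigned to that test case. This is an illustrative sketch of such a callback body, using one of the scenarios listed earlier.

% Representative POST-LOAD callback body (illustrative)
helperSLHighwayLaneFollowingControllerSetup( ...
    scenarioFcnName="scenario_LF_01_Straight_RightLane");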

Run and Explore Results for Single Test Scenario

Turn off the update messages about model predictive control objects.

mpcverbosity("off");

Test the system-level model with the scenario_LFACC_03_Curve_StopnGo test scenario from Simulink Test.

testFile = sltest.testmanager.load("HighwayLaneFollowingControllerMetricAssessments.mldatx");
testSuite = getTestSuiteByName(testFile,"Test Scenarios");
testCase = getTestCaseByName(testSuite,"scenario_LFACC_03_Curve_StopnGo");
resultObj = run(testCase);

Generate a report after the simulation.

sltest.testmanager.report(resultObj,"Report.pdf", ...
    Title="Highway Lane Following Controller", ...
    IncludeMATLABFigures=true, ...
    IncludeErrorMessages=true, ...
    IncludeTestResults=false, ...
    LaunchReport=true);

Examine Report.pdf. Observe that the Test environment section shows the platform on which the test is run and the MATLAB version used for testing. The Summary section shows the outcome of the test and duration of the simulation in seconds. The Results section shows pass or fail results based on the assessment criteria. This section also shows the logged plots from the CLEANUP callback commands.

If you have a license for Simulink Coverage™, you can get coverage results in the generated Report.pdf by enabling coverage settings in the Test Manager file. For more information, see the Coverage Settings section in Specify Test Properties in the Test Manager (Simulink Test). You can use coverage data to find gaps in testing, missing requirements, or unintended functionality.
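The following sketch shows one way to enable coverage collection programmatically before running the tests. The getCoverageSettings method and its property names are assumptions about the Simulink Test programmatic interface; verify them against your release, or enable coverage interactively in the Test Manager instead.

covSettings = getCoverageSettings(testFile);  % assumed API; verify for your release
covSettings.RecordCoverage = true;            % record coverage for the test bench model
covSettings.MdlRefCoverage = true;            % include referenced models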

Run and Explore Results for All Test Scenarios

Simulate the system for all the tests by using the run(testFile) command. Alternatively, you can simulate the system by selecting Play in the Test Manager app.
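For example, this minimal sketch runs the full test file programmatically and generates a consolidated report with an illustrative file name.

resultSet = run(testFile);
sltest.testmanager.report(resultSet,"AllScenariosReport.pdf", ...
    Title="Highway Lane Following Controller", ...
    IncludeMATLABFigures=true, ...
    IncludeErrorMessages=true, ...
    LaunchReport=false);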

When the test simulations are complete, you can view the test results in the Results and Artifacts tab of the Test Manager. For each test case, the Check Static Range (Simulink) blocks in the model are associated with the Test Manager. This association enables you to visualize the overall pass or fail results.

You can find the generated report in the current working directory. This report contains a detailed summary of the pass or fail statuses and plots for each test case.

Verify Test Status in Requirements Editor

Open the Requirements Editor and select Display. Then, select Verification Status to see a verification status summary for each requirement. The green and red bars indicate the respective pass or fail status of the simulation results for each test.
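Alternatively, you can open the requirement set in the Requirements Editor from the command line. A minimal sketch, using the requirement set loaded earlier in this example:

slreq.open("HighwayLaneFollowingControllerTestRequirements.slreqx");
slreq.editor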

Automate Testing with Generated Code

The HighwayLaneFollowingControllerTestBench model enables you to verify the generated code by performing equivalence testing for the Lane Following Decision Logic and Lane Following Controller components in open loop. To perform equivalence testing of these components, use back-to-back testing. Back-to-back tests compare the results of normal simulations with the generated code results from software-in-the-loop, processor-in-the-loop, or hardware-in-the-loop simulations. For more information, see sltest.testmanager.createTestForComponent (Simulink Test). This example focuses on verifying the Lane Following Controller.

Use these steps to create and run an equivalence test for the Lane Following Controller.

1. Select a test scenario and run the setup script.

helperSLHighwayLaneFollowingControllerSetup(scenarioFcnName="scenario_LFACC_03_Curve_StopnGo");

2. Create a test suite object.

testSuite = getTestSuiteByName(testFile,"LaneFollowingControllerEquivalenceTest");
if isempty(testSuite)
    testSuite = sltest.testmanager.TestSuite(testFile,"LaneFollowingControllerEquivalenceTest");
end

3. Create an equivalence test for the component.

testCase = sltest.testmanager.createTestForComponent("TestFile",testSuite, ...
    "Component","HighwayLaneFollowingControllerTestBench/Lane Following Controller", ...
    TestType="equivalence",Simulation1Mode="Normal", ...
    Simulation2Mode="Software-in-the-Loop (SIL)",UseComponentInputs=false, ...
    HarnessOptions={"LogOutputs",true});

A test harness is created by default in the previous step. Find and open the test harness.

harnessList = sltest.harness.find("HighwayLaneFollowingControllerTestBench/Lane Following Controller");
sltest.harness.open("HighwayLaneFollowingControllerTestBench/Lane Following Controller",harnessList(end).name);

4. Set the tolerance for the equivalence test.

Capture the equivalence criteria.

eq = captureEquivalenceCriteria(testCase);

Set the equivalence criteria tolerance for output signals.

sc = getSignalCriteria(eq);
for i=1:size(sc,2)
    if (strcmp(sc(i).Name,"steering_angle") || strcmp(sc(i).Name,"ego_acceleration"))
        sc(i).AbsTol = sqrt(eps("double"));
    else
        sc(i).Enabled = false;
    end
end

5. Run the equivalence test simulation.

run(testCase);

6. View the test results after the simulation completes. Select the Results and Artifacts tab of the Test Manager or enter this command.

sltest.testmanager.view;

The tab shows pass or fail results based on the assessment criteria. You can use this process to create equivalence tests for other test scenarios as well.
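For example, you can rerun the same equivalence test case against other scenarios by changing the scenario setup before each run. A minimal sketch, using two of the scenarios from this example:

otherScenarios = ["scenario_LFACC_01_Curve_DecelTarget", ...
    "scenario_LFACC_04_Curve_CutInOut"];
for k = 1:numel(otherScenarios)
    helperSLHighwayLaneFollowingControllerSetup(scenarioFcnName=otherScenarios(k));
    run(testCase);
end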

This process has shown you how to create and run an equivalence test programmatically. You can also do this graphically by following the steps explained in the Create and Run a Back-to-Back Test (Simulink Test) example.

The HighwayLaneFollowingControllerTestBench model also enables integrated, closed-loop testing of the Lane Following Decision Logic and Lane Following Controller components with the Vehicle Dynamics component. Regression testing of these components through SIL verification enables you to identify issues at the system level and to verify that the generated code produces results that match the system-level requirements throughout the simulation.

Set the Lane Following Decision Logic to run in software-in-the-loop mode.

model = "HighwayLaneFollowingControllerTestBench/Lane Following Decision Logic";
set_param(model,SimulationMode="Software-in-the-loop")

Set the Lane Following Controller to run in software-in-the-loop mode.

model = "HighwayLaneFollowingControllerTestBench/Lane Following Controller";
set_param(model,SimulationMode="Software-in-the-loop")

Use the run(testFile) command to simulate the system for all test scenarios. After the tests are complete, review the plots and results in the generated report. If you have a license for Simulink Coverage, you can also get the code coverage analysis for the generated code in the generated report by enabling coverage settings in the Test Manager file.

You can visualize the coverage results for individual test cases, as well as the aggregated coverage results.
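As with the normal-mode runs, you can run the full test file programmatically and then restore the components to normal simulation mode when SIL testing is complete. A minimal sketch:

run(testFile);

set_param("HighwayLaneFollowingControllerTestBench/Lane Following Decision Logic", ...
    SimulationMode="Normal");
set_param("HighwayLaneFollowingControllerTestBench/Lane Following Controller", ...
    SimulationMode="Normal");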

Reenable the MPC update messages.

mpcverbosity("on");

Automate Testing in Parallel

If you have a Parallel Computing Toolbox™ license, you can configure the Test Manager to execute tests in parallel using a parallel pool. To run tests in parallel, save the model after disabling the runtime visualizations by using save_system("HighwayLaneFollowingControllerTestBench"). The Test Manager uses the default Parallel Computing Toolbox cluster and executes tests only on the local machine. Running tests in parallel can significantly reduce the time it takes to get test results. For more information on how to configure tests in parallel from the Test Manager, see Run Tests Using Parallel Execution (Simulink Test).
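A minimal sketch of the preparation steps, assuming Parallel Computing Toolbox is installed; parallel execution itself is enabled from the Test Manager toolstrip:

% Save the model after disabling the runtime visualizations.
save_system("HighwayLaneFollowingControllerTestBench");

% Start a pool on the default local cluster if one is not already running.
if isempty(gcp("nocreate"))
    parpool;
end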
