
Artifact Tracing

Units in the Dashboard

A unit is a functional entity in your software architecture that you can execute and test independently or as part of larger system tests. Software development standards, such as ISO 26262-6, define objectives for unit testing. Unit tests typically must cover each of the requirements for the unit and must demonstrate traceability between the requirements, the test cases, and the unit. Unit tests must also meet certain coverage objectives for the unit, such as modified condition/decision coverage (MC/DC).

You can label models as units in the dashboard. If you do not specify the models that are considered units, then the dashboard considers a model to be a unit if it does not reference other models.

In the Dashboard window, in the Project panel, the unit dashboard icon indicates a unit. If a unit is referenced by a component, it appears under the component in the Project panel. If a unit references one or more other models, those models are part of the unit. The referenced models appear in the Design folder under the unit and contribute to the metric results for the unit.

Project panel showing a component model, Component1, that contains two unit models: Unit1 and Unit2. The panel also shows a component model, Component2, that contains a component model, Component3.

To specify which models are units, label them in your project and configure the dashboard to recognize the label, as shown in Specify Models as Components and Units.

Components in the Dashboard

A component is an entity that integrates multiple testable units together. For example:

  • A model that references multiple unit models could be a component model.

  • A System Composer™ architecture model could be a component. Supported architectures include System Composer architecture models, System Composer software architecture models, and AUTOSAR architectures.

  • A component could also integrate other components.

The dashboard organizes components and units under the components that reference them in the Project panel.

If you do not specify the models that are considered components, then the dashboard considers a model to be a component if it references one or more other models.
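
If you are not sure how the dashboard will classify your models by default, you can check which of them reference other models. The following sketch illustrates the default rule by using the Simulink function find_mdlrefs; it is not the dashboard's own analysis, and it assumes that the project is open and the models are on the project path:

    proj = currentProject;
    modelFiles = proj.Files(endsWith([proj.Files.Path], [".slx", ".mdl"]));
    for k = 1:numel(modelFiles)
        [~, name] = fileparts(modelFiles(k).Path);
        % List only the referenced models; exclude the top model from the result
        refs = find_mdlrefs(name, "ReturnTopModelAsLastElement", false);
        if isempty(refs)
            fprintf("%s -> unit by default (no referenced models)\n", name);
        else
            fprintf("%s -> component by default (references %s)\n", name, strjoin(refs, ", "));
        end
    end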

In the Dashboard window, in the Project panel, the component icon indicates a component. To see the units under a component, expand the component node by clicking the arrow next to the component icon.

To specify the models that are considered components, label them in your project and configure the dashboard to recognize the label, as shown in Specify Models as Components and Units.

Specify Models as Components and Units

You can control which models appear as units and components by labeling them in your project and configuring the dashboard to recognize the labels.

  1. Open a project. To open the dashboard example project, at the command line, enter dashboardCCProjectStart. This example project already has component and unit models configured.

  2. In MATLAB®, at the bottom left of the Project window, right-click in the Labels pane and click Create New Category. Type a name for the category that will contain your testing architecture labels, for example, Testing Interface, and then click Create.

  3. Create a label for the units. On the Labels pane, right-click the category that you created and click Create New Label. Type the label name Software Unit and click OK.

  4. Create another label for component models and name the label Software Component.

    Project window showing the labels pane in the bottom left corner. The Testing Interface category is expanded and the labels Software Component and Software Unit are under the category.

    The unit and component labels appear under the category in the Labels pane.

  5. Label the models in the project as components and units. In the project pane, right-click a model and click Add label. In the dialog box, select the label and click OK. For this example, apply these labels:

    • db_Controller — Software Component

    • db_ControlMode — Software Unit

    • db_DriverSwRequest — Software Unit

    • db_LightControl — Software Unit

    • db_TargetSpeedThrottle — Software Unit

  6. Open the Dashboard window by using one of these approaches:

    • On the Project tab, in the Tools section, click Model Testing Dashboard.

    • On the Project tab, in the Tools section, click Model Design Dashboard.

  7. On the Dashboard tab, click Options.

  8. In the Project Options dialog box, in the Classification section, specify the category and labels that you created for the components and units. For the component interface, set Category to Testing Interface and Label to Software Component. For the unit interface, set Category to Testing Interface and Label to Software Unit.

    Project Options dialog box showing categories and labels specified for component and unit interfaces

  9. Click Apply. The dashboard updates the traceability information in the Project panel and organizes the models under the component models that reference them. If a model is not referenced by a component, it appears at the top level with the components.

To open a dashboard for a unit or component, click the name of the unit or component in the Project panel. The dashboard shows the metric results for the unit or component you select.
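
You can also script the labeling steps with the project API instead of using the Labels pane. The following sketch is a minimal example that assumes the example project layout for the model paths and uses the modelTestingDashboard function to open the dashboard; adjust the paths and labels for your own project, and then set the category and labels in the Project Options dialog box as in steps 7 through 9.

    dashboardCCProjectStart                      % open the dashboard example project
    proj = currentProject;

    % Create the category and labels if they do not already exist
    if isempty(findCategory(proj, "Testing Interface"))
        category = createCategory(proj, "Testing Interface");
        createLabel(category, "Software Component");
        createLabel(category, "Software Unit");
    end

    % Apply the labels to the models (paths assume the example project layout)
    addLabel(findFile(proj, "models/db_Controller.slx"), "Testing Interface", "Software Component");
    unitModels = ["db_ControlMode", "db_DriverSwRequest", "db_LightControl", "db_TargetSpeedThrottle"];
    for name = unitModels
        addLabel(findFile(proj, "models/" + name + ".slx"), "Testing Interface", "Software Unit");
    end

    % Open the Model Testing Dashboard, then configure the category and labels
    % in the Project Options dialog box
    modelTestingDashboard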

Trace Artifacts to Units and Components

To determine which artifacts are in the scope of a unit or component, the dashboard analyzes the traceability links between the artifacts, software unit models, and component models in the project. The Project panel lists the units, organized by the components that reference them.

Project panel showing units under a component

When you select a unit or component in the Project panel, the Artifacts panel shows the artifacts that trace to the selected unit or component. Traced artifacts include:

  • Functional Requirements

  • Design Artifacts

  • Tests

  • Test Results

Artifacts panel showing traced artifacts for a unit

To see the traceability path that the dashboard found from an artifact to its unit or component, right-click the artifact and click View trace to dashboard. A trace view opens in a new tab in the dashboard. The trace view shows the connections and intermediate artifacts that the dashboard traced from the unit or component to the artifact. To see the type of traceability that connects two artifacts, place your cursor over the arrow that connects the artifacts. The traceability relationship is either one artifact containing the other or one artifact tracing to the other. For example, for the unit db_DriverSwRequest, expand Functional Requirements > Upstream > db_SystemReqs.slreqx. Right-click the requirement for Target speed increment and click View trace to dashboard. The trace view shows that the unit db_DriverSwRequest traces to the implemented functional requirement Output request mode, which traces to the upstream functional requirement Target speed increment.

Dashboard trace view for a functional requirement

In the Artifacts panel, the folder Trace Issues contains unexpected requirement links, requirement links that are broken or that the dashboard does not support, and artifacts that the dashboard cannot trace to a unit or component. To help identify the type of tracing issue, the folder Trace Issues contains subfolders for Unexpected Implementation Links, Unresolved and Unsupported Links, Untraced Tests, and Untraced Results. For more information, see Fix Requirements-Based Testing Issues.

If an artifact returns an error during traceability analysis, the panel includes the artifact in an Errors folder. Use the traceability information in these folders to check whether the artifacts trace to the units or components that you expect. To see details about the warnings and errors that the dashboard finds during artifact analysis, at the bottom of the dashboard dialog, click Diagnostics.

Functional Requirements

The folder Functional Requirements shows requirements of Type Functional that are either implemented by or upstream of the unit or component.

When you collect metric results, the dashboard analyzes only the functional requirements that the unit or component directly implements. The folder Functional Requirements contains two subfolders to help identify which requirements are implemented by the unit or component, or are upstream of the unit or component:

  • Implemented — Functional requirements that are directly linked to the unit or component with a link Type of Implements. The dashboard uses these requirements in the metrics for the unit or component.

  • Upstream — Functional requirements that are indirectly or transitively linked to the implemented requirements. The dashboard does not use these requirements in the metrics for the unit or component.
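
If you want to collect the dashboard metric results programmatically, for example in a continuous integration job, a minimal sketch using the metric API might look like this. It assumes that the project is open and that Simulink Check is installed:

    metricEngine = metric.Engine();
    ids = getAvailableMetricIds(metricEngine);   % metric identifiers available in your installation
    execute(metricEngine, ids);                  % analyze the project artifacts and collect results
    results = getMetrics(metricEngine, ids);     % array of metric.Result objects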

Use the Requirements Toolbox™ to create or import the requirements in a requirements file (.slreqx). If a requirement does not trace to a unit or component, it appears in the Trace Issues folder. If a requirement does not appear in the Artifacts panel when you expect it to, see Requirement Missing from Artifacts Panel.
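
As a minimal sketch, you can also create such a requirements file programmatically with the Requirements Toolbox API. The requirement IDs and summaries below are hypothetical; you then link the requirements to the model, for example from the Requirements Editor:

    rs = slreq.new("db_ExampleReqs");                               % creates db_ExampleReqs.slreqx
    parent = add(rs, "Id", "R1", "Summary", "Target speed increment");
    add(parent, "Id", "R1.1", "Summary", "Output request mode");    % child requirement
    save(rs);                                                       % save the file inside the project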

For more information on how the dashboard traces dependencies between project files, see Digital Thread.

Design Artifacts

The folder Design shows project artifacts that trace to the current unit or component, including:

  • The model file that contains the block diagram for the unit or component.

  • Models that the unit or component references.

  • Libraries that are partially or fully used by the model.

  • Data dictionaries that are linked to the model.

  • External MATLAB code that traces to the model. If you expect external MATLAB code to appear in the dashboard and it does not, see External MATLAB Code Missing from Artifacts Panel.

If an artifact does not appear in the Design folder when you expect it to, see Resolve Missing Artifacts, Links, and Results in the Model Testing Dashboard or Resolve Missing Artifacts and Results in the Model Maintainability Dashboard. For more information on how the dashboard traces dependencies between project files, see Digital Thread.
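
If you want a rough, file-level preview of these design dependencies, you can run the project dependency analysis, as in the sketch below. This uses the project API rather than the dashboard trace analysis, and the model path is an assumption based on the example project layout:

    proj = currentProject;
    updateDependencies(proj);                        % run the project dependency analysis
    % Files that the unit model requires, such as referenced models,
    % libraries, and data dictionaries
    required = listRequiredFiles(proj, "models/db_DriverSwRequest.slx")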

Tests

The folder Tests shows test cases and test harnesses that trace to the selected unit.

When you collect metric results for a unit, the dashboard analyzes only the test cases for unit tests. The folder Tests contains subfolders to help identify whether a test case is considered a unit test and which test harnesses trace to the unit:

  • Unit Tests — Test cases that the dashboard considers as unit tests. A unit test directly tests either the entire unit or lower-level elements in the unit, like subsystems. The dashboard uses these tests in the metrics for the unit.

  • Others — Test cases that trace to the unit but that the dashboard does not consider as unit tests. For example, the dashboard does not consider tests on a library to be unit tests. The dashboard does not use these tests in the metrics for the unit.

  • Test Harnesses — Test harnesses that trace to the unit or lower-level elements in the unit. Double-click a test harness to open it.

Create test cases in a test file by using Simulink® Test™. If a test case does not trace to a unit, it appears in the Trace Issues folder. If a test case does not appear in the Artifacts panel when you expect it to, see Test Case Missing from Artifacts Panel. For troubleshooting test cases in metric results, see Fix a test case that does not produce metric results.
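
For example, a minimal sketch that creates a unit test case for one of the example unit models with the Simulink Test API might look like this. The test file name, test case name, and baseline test type are illustrative choices:

    tf = sltest.testmanager.TestFile("db_DriverSwRequest_Tests");   % creates the .mldatx test file
    ts = getTestSuites(tf);                                         % default test suite
    tc = createTestCase(ts, "baseline", "Request mode test");
    setProperty(tc, "Model", "db_DriverSwRequest");                 % test the unit model directly
    saveToFile(tf);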

For more information on how the dashboard traces dependencies between project files, see Digital Thread.

Test Results

When you collect metric results for a unit, the dashboard analyzes only the test results from unit tests. The folder Test Results contains two subfolders to help identify which test results are from unit tests:

  • Unit Simulation — Simulation results from unit tests. The dashboard uses these results in the metrics for the unit.

    The following types of test results are shown:

    • Saved test results — Results that you have collected in the Test Manager and exported to a results file.

    • Temporary test results — Results that you have collected in the Test Manager but have not exported to a results file. When you export the results from the Test Manager, the dashboard analyzes the saved results instead of the temporary results. Additionally, the dashboard stops recognizing the temporary results when you close the project or close the result set in the Simulink Test Result Explorer. If you want to analyze the results in a subsequent test session or project session, export the results to a results file, as shown in the sketch after this list.

  • Others — Results that are not simulation results, are not from unit tests, or are only reports. For example, software-in-the-loop (SIL) results are not simulation results. The dashboard does not use these results in the metrics for the unit.
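
The following sketch runs the test files that are loaded in the Test Manager and exports the result set to a results file so that the dashboard analyzes saved results instead of temporary results. The results file name is illustrative:

    resultSet = sltest.testmanager.run;          % run the test files loaded in the Test Manager
    sltest.testmanager.exportResults(resultSet, "db_DriverSwRequest_Results.mldatx");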

If a test result does not trace to a unit, it appears in the Trace Issues folder. If a test result does not appear in the Artifacts panel when you expect it to, see Test Result Missing from Artifacts Panel. For troubleshooting test results in dashboard metric results, see Fix a test result that does not produce metric results.

For more information on how the dashboard traces dependencies between project files, see Digital Thread.

Trace Issues

The folder Trace Issues shows artifacts that the dashboard has not traced to any units or components. Use the folder Trace Issues to check if artifacts are missing traceability to the units or components. The folder Trace Issues contains subfolders to help identify the type of tracing issue:

  • Unexpected Implementation Links — Requirement links of Type Implements for a requirement of Type Container or Type Informational. The dashboard does not expect these links to be of Type Implements because container requirements and informational requirements do not contribute to the Implementation and Verification status of the requirement set that they are in. If a requirement is not meant to be implemented, you can change the link type. For example, you can change a requirement of Type Informational to have a link of Type Related to.

  • Unresolved and Unsupported Links — Requirements links that are either broken in the project or not supported by the dashboard. For example, if a model block implements a requirement, but you delete the model block, the requirement link is now unresolved. The dashboard does not support traceability analysis for some artifacts and some links. If you expect a link to trace to a unit or component and it does not, see the troubleshooting solutions in Resolve Missing Artifacts, Links, and Results in the Model Testing Dashboard.

  • Untraced Tests — Tests that execute on models or lower-level elements, like subsystems, that are not on the project path.

  • Untraced Results — Results that the dashboard cannot trace to a test case. For example, if a test case produces a result, but you delete the test case, the dashboard cannot trace the results to the test case.

The dashboard does not support traceability analysis for some artifacts and some links. If an artifact is untraced when you expect it to trace to a unit or component, see the troubleshooting solutions in Resolve Missing Artifacts, Links, and Results in the Model Testing Dashboard.

Artifact Errors

The folder Errors appears if artifacts returned errors when the dashboard performed artifact analysis. These are some errors that artifacts might return during traceability analysis:

  • An artifact returns an error if it has unsaved changes when traceability analysis starts.

  • A test results file returns an error if it was saved in a previous version of Simulink.

  • A model returns an error if it is not on the search path.

Open these artifacts and fix the errors. The dashboard shows a banner at the top of the Dashboard window to indicate that the artifact traceability shown in the Project and Artifacts panels is outdated. Click the Trace Artifacts button on the banner to refresh the data in the Project and Artifacts panels.
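
For the unsaved-changes case, a minimal sketch that saves any loaded models with unsaved changes before you refresh the traceability data:

    loadedModels = find_system('SearchDepth', 0);    % names of loaded models and libraries
    for k = 1:numel(loadedModels)
        if bdIsDirty(loadedModels{k})                % block diagram has unsaved changes
            save_system(loadedModels{k});
        end
    end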

Diagnostics

To see details about artifacts that cause errors, warnings, and informational messages during analysis, at the bottom of the dashboard dialog, click Diagnostics. You can filter the diagnostic messages by their type: Error, Warning, and Info. You can also clear the messages from the viewer.

The diagnostic messages show:

  • Modeling constructs that the dashboard does not support

  • Links that the dashboard does not trace

  • Test harnesses or cases that the dashboard does not support

  • Test results missing coverage or simulation results

  • Artifacts that return errors when the dashboard loads them

  • Information about model callbacks that the dashboard deactivates

  • Files that have file shadowing or path traceability issues

  • Artifacts that are not on the path and are not considered during tracing