Manage Requirements-Based Testing Artifacts for Analysis in the Model Testing Dashboard

When you develop and test software components using Model-Based Design, use the Model Testing Dashboard to assess the status and quality of your model testing activities. Requirements-based testing is a central element of model verification. By establishing traceability links between your requirements, model design elements, and test cases, you can measure the extent to which the requirements are implemented and verified. The Model Testing Dashboard analyzes this traceability information and provides detailed metric measurements on the traceability, status, and results of these testing artifacts.

Model Testing Dashboard

Each metric in the dashboard measures a different aspect of the quality of your model testing and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. To monitor the requirements-based testing quality of your models in the Model Testing Dashboard, store your artifacts in a project and follow the considerations in this topic. For more information on using the Model Testing Dashboard, see Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
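If you prefer to work programmatically, you can open the project and the dashboard from the MATLAB command line. This is a minimal sketch; the project path is a hypothetical placeholder for your own project location.

```matlab
% Open the project that contains your design and testing artifacts,
% then open the Model Testing Dashboard for that project.
proj = openProject("C:/work/MyComponentProject");  % hypothetical path
modelTestingDashboard                              % opens the dashboard UI
```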

Manage Artifact Files in a Project

To analyze your requirements-based testing activities in the Model Testing Dashboard, store your design and testing artifacts in a project. The artifacts that the testing metrics analyze include:

  • Models

  • Requirements that you create in Simulink® Requirements™

  • Libraries that the models use

  • Test cases that you create in Simulink Test™

  • Test results from the executed test cases

To ensure that the Model Testing Dashboard analyzes your latest artifacts, check that you:

  • Save the changes to your artifact files.

  • Export test results and save them in a results file.

  • Store the files that you want to analyze in the project.
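The steps above can be sketched at the command line. This example assumes an open project and a test session in the Simulink Test Manager; the model and file names are placeholders.

```matlab
% Save model changes so that the dashboard analyzes the latest design.
save_system("db_DriverSwRequest");                 % placeholder model name

% Run the test files that are open in the Test Manager, then export the
% results to a results file so that they persist across sessions.
results = sltest.testmanager.run;
sltest.testmanager.exportResults(results, ...
    "db_DriverSwRequest_results.mldatx");          % placeholder file name

% Keep the exported results file in the project for analysis.
proj = currentProject;
addFile(proj, "db_DriverSwRequest_results.mldatx");
```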

Model Software Components for Requirements-Based Testing

The Model Testing Dashboard provides traceability and testing analysis for each component in your project. A component is a functional entity within your software architecture that you can execute and test independently or as part of larger system tests. For each component, you develop functional requirements based on the high-level system requirements and the role of the component. You then model the component algorithm to fulfill those functional requirements. To test the component, you derive test cases from the requirements and run them on the model. Throughout this process, you create and maintain explicit or implicit traceability links between:

  • Each functional requirement and the model elements that implement it

  • Each functional requirement and the test cases that verify it

  • Each test case and the model that it tests

  • Each test case and the latest results that it produced

These traceability links allow you to track the completeness of your requirements, design, and testing activities, and help you find gaps in design and testing. If a test fails, you can follow the traceability links to the test case that failed, the requirement that it tested, and the model element that implemented the requirement. This allows you to quickly find the design errors that might have caused the test failure. Industry standards for software development, such as ISO 26262 and DO-178C, require traceability between these artifacts to show testing completeness.

Label Software Component Models for Analysis in the Dashboard

Specify which models in your project are software components so that the dashboard can analyze their testing status. Label the component models in your project and configure the dashboard to find models that have the label.

  1. In your project, create a label that you can use to identify software component models. For example, Software Component. For an example of how to create a label, see Create Labels.

  2. Add the label to the software component models. For an example of how to add a label, see Add Labels to Files.

  3. In the Model Testing Dashboard, click Options. Select the category and label that you created for identifying software component models.

  4. Click Trace Artifacts. The dashboard updates the list of components in the Artifacts pane to show only the models that have the software component label that you added. Then the dashboard updates the traceability data for those component models.

Controlling the list of models that the dashboard analyzes makes it easier to track your testing progress for only the software components that need requirements-based testing.
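Steps 1 and 2 can also be performed programmatically with the MATLAB project API. This is a sketch; the category name, label name, and file path are placeholders that you would replace with your own.

```matlab
% Create a label category and a label for identifying software components.
proj = currentProject;
cat = createCategory(proj, "Testing");             % hypothetical category name
createLabel(cat, "Software Component");

% Apply the label to a component model file in the project.
f = findFile(proj, "models/db_DriverSwRequest.slx");  % placeholder path
addLabel(f, "Testing", "Software Component");
```

After labeling the models, select the category and label in the dashboard Options, as in steps 3 and 4.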

Trace Artifacts to Components for Model Testing Analysis

To determine which artifacts are in the scope of a component, the Model Testing Dashboard analyzes the traceability links between the artifacts and the software component models in the project. The Artifacts panel lists each component, represented by the model name, and the artifacts that trace to the component:

  • Functional Requirements

  • Design Artifacts

  • Test Cases

  • Test Results

Artifacts panel showing components and traced artifacts

To see the traceability path that the dashboard found from an artifact to its component, right-click the artifact and click View trace to component. A traceability graph opens in a new tab in the Model Testing Dashboard. The graph shows the connections and intermediate artifacts that the dashboard traced from the component to the artifact. To see the type of traceability that connects two artifacts, place your cursor over the arrow that connects the artifacts. The traceability relationship is either one artifact containing the other or one artifact tracing to the other. For example, the trace view for the functional requirement CC003_05 shows that it is contained in the requirement Activating cruise control. The container requirement traces to the functional requirement Set Switch Detection, which traces to the component db_DriverSwRequest.

Dashboard trace view for a functional requirement.

After the list of components, the Untraced folder shows artifacts that the dashboard has not traced to any of the models. If an artifact returns an error during traceability analysis, the panel includes the artifact in the Errors folder. Use the traceability information in these sections and in the components to check whether the testing artifacts trace to the models that you expect. To see details about the warnings and errors that the dashboard finds during artifact analysis, at the bottom of the Model Testing Dashboard dialog, click Diagnostics.

As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the traceability data in the Artifacts panel might be stale by enabling the Trace Artifacts button. To update the traceability data, click Trace Artifacts. If the button is not enabled, the dashboard has not detected changes that affect the traceability information.

Functional Requirements

The folder Functional Requirements shows requirements where the Type is set to Functional and that trace to the component model directly or through a container requirement, a library subsystem, or a combination of the two. For more information about linking requirements, see Requirement Links (Simulink Requirements).

If a requirement does not trace to a component, it appears in the Untraced Artifacts folder. If a requirement does not appear in the Artifacts panel when you expect it to, see Requirement Missing from Artifacts Pane.

When you collect metric results for a component, the dashboard analyzes a subset of the requirements that appear in the Functional Requirements folder. The metrics analyze only requirements where the Type is set to Functional and that are directly linked to the model with a link where the Type is set to Implements. A requirement that traces to the component but does not have these settings appears in the Functional Requirements folder but does not contribute to the metric results for requirements. For troubleshooting metric results for requirements, see Fix a requirement that does not produce metric results.
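You can check and set these properties programmatically with the Simulink Requirements API. This is a hedged sketch: the requirement set name, requirement ID, and block path reuse names from the example above as placeholders, and the exact link-creation call may vary with your artifact types.

```matlab
% Set a requirement's Type to Functional so that the requirement
% metrics analyze it.
reqSet = slreq.load("db_req_func_spec");           % placeholder file name
req = find(reqSet, "Type", "Requirement", "Id", "CC003_05");
req.Type = "Functional";

% Link the implementing block to the requirement; the link Type must be
% Implement for the metric to count the requirement as implemented.
link = slreq.createLink("db_DriverSwRequest/Set Switch Detection", req);
link.Type = "Implement";
save(reqSet);
```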

Design Artifacts

The folder Design Artifacts shows:

  • The model file that contains the block diagram for the component.

  • Libraries that are partially or fully used by the model.

  • Data dictionaries that are linked to the model.

Test Cases

The folder Test Cases shows test cases that trace to the model. This includes test cases that run on the model and test cases that run on subsystems in the model by using test harnesses. Create these test cases in a test suite file by using Simulink Test.

If a test case does not trace to a component, it appears in the Untraced Artifacts folder. If a test case does not appear in the Artifacts panel when you expect it to, see Test Case Missing from Artifacts Pane.

When you collect metric results for a component, the dashboard analyzes a subset of the test cases that appear in the Test Cases folder. The dashboard analyzes only test cases that run on the model. Subsystem test harnesses appear in the folder but do not contribute to the metrics because they do not test the whole model. For troubleshooting test cases in metric results, see Fix a test case that does not produce metric results.

Test Results

The folder Test Results shows these types of test results from test cases that test the model:

  • Saved test results — results that you have collected in the Test Manager and have exported to a results file.

  • Temporary test results — results that you have collected in the Test Manager but have not exported to a results file. When you export the results from the Test Manager, the dashboard analyzes the saved results instead of the temporary results. Additionally, the dashboard stops recognizing the temporary results when you close the project or close the result set in the Simulink Test Result Explorer. If you want to analyze the results in a subsequent test session or project session, export the results to a results file.

If a test result does not trace to a component, it appears in the Untraced Artifacts folder. If a test result does not appear in the Artifacts panel when you expect it to, see Test Result Missing from Artifacts Pane.

When you collect metric results for a component, the dashboard analyzes a subset of the test results that appear in the Test Results folder. For troubleshooting test results in dashboard metric results, see Fix a test result that does not produce metric results.

Untraced Artifacts

The folder Untraced shows artifacts that the dashboard has not traced to any of the models. Use the Untraced folder to check if artifacts are missing traceability to the components. When you add traceability to an artifact, update the information in the panel by clicking Trace Artifacts. The Model Testing Dashboard does not support traceability analysis for some artifacts and some links. If an artifact is untraced when you expect it to trace to a component, see the troubleshooting solutions in Untraced Artifacts.

Artifact Errors

The folder Errors shows artifacts that returned errors when the dashboard performed artifact analysis. These are some errors that artifacts might return during traceability analysis:

  • An artifact returns an error if it has unsaved changes when traceability analysis starts.

  • A test results file returns an error if it was saved in a previous version of Simulink.

  • A model returns an error if it is not on the search path.

Open these artifacts and fix the errors. Then, to analyze the traceability in the dashboard, click Trace Artifacts.

Diagnostics

To see details about artifacts that cause warnings or errors during analysis, at the bottom of the Model Testing Dashboard dialog, click Diagnostics. The diagnostics viewer displays errors, warnings, and information messages. You can filter the diagnostic messages by type and clear the messages from the viewer.

The diagnostic messages show:

  • Modeling constructs that the dashboard does not support

  • Links that the dashboard does not trace

  • Test harnesses or cases that the dashboard does not support

  • Test results missing coverage or simulation results

  • Artifacts that return errors when the dashboard loads them

  • Information about model callbacks that the dashboard deactivates

Collect Metric Results

The Model Testing Dashboard collects metric results for each component listed in the Artifacts pane. Each metric in the dashboard measures a different aspect of the quality of the testing of your model and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. For more information about the available metrics and the results that they return, see Model Testing Metrics.

As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the metric results in the dashboard might be stale. If your changes affect the traceability information in the Artifacts panel, click Trace Artifacts. After you update the traceability information, if the metric results might be affected by your artifact changes, the Stale Metrics icon Stale Metrics appears at the top of the dashboard. Affected widgets appear highlighted in gray. To update the results, click Collect Results > Collect All Results.
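You can also collect metric results without the dashboard UI by using the metric API from Simulink Check. This is a sketch that assumes the project containing your components is open; collecting all metrics can take some time on large projects.

```matlab
% Create a metric engine for the current project and collect results.
metricEngine = metric.Engine();
ids = getAvailableMetricIds(metricEngine);   % IDs of collectible metrics
execute(metricEngine, ids);                  % collect results for all metrics
results = getMetrics(metricEngine, ids);     % retrieve the collected results

% Optionally generate a summary report of the results.
generateReport(metricEngine, "Type", "html-file");
```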

The dashboard does not indicate stale metric data for these changes:

  • After you run a test case and analyze the results in the dashboard, if you make changes to the test case, the dashboard indicates that test case metrics are stale but does not indicate that the results metrics are stale.

  • When you change a coverage filter file that your test results use, the coverage metrics in the dashboard do not indicate stale data or include the changes. After you save the changes to the filter file, re-run the tests and use the filter file for the new results.

When you collect metric results for a component, the dashboard returns results for a subset of the artifacts that trace to the component. However, metric results that count traceability links between requirements and test cases include links to artifacts that might trace to other components or to no components.

For example, if a test case TestCaseA tests ModelA, then running the metric Test case linked to requirements on ModelA returns a result for that test case. When the metric checks for requirements that are linked to TestCaseA, the metric does not consider the implementation or traceability status of the requirements. If TestCaseA has a Verifies link to a requirement RequirementB, which is linked to a different model, then the metric returns true, indicating that the test case is linked. However, if you run the metric Requirement linked to test cases on ModelA, it does not return a result for RequirementB because the requirement is not linked to ModelA. For a test case that is linked to requirements, check that the linked requirements are implemented by the model that the test case runs on. Additionally, for a requirement that is linked to test cases, check that the test cases run on the model that implements the requirement.

See Also

Related Topics