Manage the Requirements-Based Testing Process with Model Testing Dashboard
Get an overview of the Model Testing Dashboard, which summarizes the quality and completeness of your requirements-based testing. The dashboard helps you manage the progress of your testing in accordance with industry-recognized software development standards such as ISO 26262 and DO-178C. It analyzes the artifacts in a project from multiple sources and tools, such as requirements, models, and test results, and provides detailed metric measurements of their status.
The dashboard widgets summarize each metric so you can quickly evaluate the current results of testing, including adherence to guidelines, completion of testing for requirements, and the percentage of requirements covered by tests. You can use the dashboard to gain insight into testing status, identify gaps, and respond faster to requirements changes.
Published: 3 Sep 2020
The Model Testing Dashboard provides a central place to manage the progress, completeness, and quality of requirements-based testing across a project to comply with ISO 26262 or DO-178C.
When you are verifying that your design meets requirements, there are many artifacts that you need to manage for each design model, and you need full traceability to track that they are consistent and complete.
But how do we enable innovation in an environment that requires such high levels of rigor?
Fortunately, industry standards tell us how to do systematic requirements-based verification.
These practices are typically encoded in checklists that you can use to measure your testing activity against.
A test case checklist, like this example, tells you what you need to do and when you’re done developing your test cases.
For example, test cases should cover all requirements, and each test case should trace to a requirement.
A test result checklist determines whether we are meeting quality goals.
It checks, for example, that all tests pass and that any missing coverage is justified.
To answer these questions, the Model Testing Dashboard in Simulink Check provides simple visualizations to assess the completeness and quality of your requirements-based testing.
Now let’s explore the dashboard.
Here is a cruise control project that includes requirements, design, tests, and test results.
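If you want to follow along programmatically, a minimal sketch like this opens a project and the dashboard from the MATLAB command line (the project path here is hypothetical; substitute your own):

    % Open the project that contains the requirements, models, and tests.
    openProject('CruiseControlProject/CruiseControl.prj');
    % Open the Model Testing Dashboard for the current project.
    modelTestingDashboard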
We can open the dashboard and see that it has analyzed the project and shows the results of testing for each component.
On the left is an artifact panel, which organizes all the requirements, tests, and test results by the model component they relate to.
The Functional Requirements folder shows the requirements to be implemented for this component, based on traceability.
The Untraced folder shows any artifacts in the project that are missing traceability and require further review.
On the right is the test case and test result status for the selected unit.
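The analysis behind the dashboard can also be run from the command line with the metric.Engine API in Simulink Check. This is a sketch of that workflow, not a required step:

    % Create a metric engine for the current project.
    metricEngine = metric.Engine();
    % Collect every available model testing metric across the project.
    metricIds = getAvailableMetricIds(metricEngine);
    execute(metricEngine, metricIds);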
To meet our test case checklist, this widget tells us whether the test cases cover all requirements.
To see more details, hover over a widget and a tooltip appears.
You can drill down to see which artifacts make up the metrics.
You can see the test cases linked with the requirements.
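You can also pull these numbers out in a script. As a sketch, assuming the metric engine from before and the documented metric identifier 'RequirementWithTestCase' (confirm the identifier and result fields against your release):

    % Retrieve per-requirement results: the value indicates whether a
    % requirement has at least one test case linked to it.
    results = getMetrics(metricEngine, 'RequirementWithTestCase');
    for k = 1:numel(results)
        disp(results(k).Value)
    end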
You can open any of the items directly to take action to resolve issues. For example, let’s open this test case in Simulink Test.
The histograms show a summary of the traceability to quickly assess testing coverage.
This one shows how many tests there are per requirement.
The left-most bins show requirements that may have insufficient test cases, and the right-most bins show requirements with a large number of tests, which may indicate that they are too general and need to be broken down.
You can use this to quickly see that many requirements have no tests.
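The histogram data behind this widget can be queried the same way; 'TestCasesPerRequirementDistribution' is the documented identifier for the tests-per-requirement metric, though the exact shape of the returned value is an assumption to verify:

    % Retrieve the tests-per-requirement distribution shown in the histogram.
    distResult = getMetrics(metricEngine, 'TestCasesPerRequirementDistribution');
    disp(distResult(1).Value)   % left-most bins flag under-tested requirements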
With this widget, you can see whether test cases are linked to requirements.
We can quickly answer our question of whether each test case is linked to a requirement.
We can see that one is unlinked.
And the histogram shows that there are two tests that are testing multiple requirements.
Let’s address the missing link.
The Set button test is missing a requirement link, and we can view it in the Test Manager.
If we review the requirements that are missing links, we can see that the Set Switch requirement is missing a test. We can go to Requirements Toolbox, add the link, and then save the links.
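As a scripted alternative, you could create the link with the Requirements Toolbox and Simulink Test APIs. The file names, requirement summary, and indices below are assumptions based on the cruise control example; adjust them to your project:

    % Load the requirement set and find the requirement that needs a test link.
    reqSet = slreq.load('crs_req_func_spec');
    req = find(reqSet, 'Type', 'Requirement', 'Summary', 'Set Switch Detection');
    % Load the test file and get the unlinked test case.
    tf = sltest.testmanager.load('crs_controller_tests.mldatx');
    suites = getTestSuites(tf);
    tc = getTestCases(suites(1));
    % Create the traceability link from the test case to the requirement,
    % then save; depending on your setup, the link set may need saving too.
    slreq.createLink(tc(1), req);
    save(reqSet);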
The dashboard detects the update, and we can recollect the metrics to see the change.
On the right, metrics breaking down the test types and tags provide further insight into the testing performed.
At the bottom is a widget showing the overall pass/fail status of tests linked to requirements, which answers our question of whether all tests passed and helps identify issues.
The dashboard analyzes the results file exported from Simulink Test, which may include coverage metrics from Simulink Coverage.
The coverage metrics show model elements not exercised during testing. We are missing some coverage and may need to justify it.
We can navigate to the Test Manager to see the coverage results for a specific test case and take action.
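To script this last step, you can open the Test Manager and export a consolidated metrics report; the generateReport name-value options shown are assumptions to confirm against your release:

    % Open the Test Manager to review coverage results interactively.
    sltest.testmanager.view
    % Generate an HTML report of the collected model testing metrics.
    generateReport(metricEngine, 'Type', 'html-file', ...
        'Location', fullfile(pwd, 'ModelTestingReport.html'));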
That’s a quick overview of the dashboard.
To learn more, try this example or visit our website to request a trial.