metric.Engine

Collect metric data on model testing artifacts

Description

A metric.Engine object represents the metric engine that you run with the execute object function to collect metric data on the status and quality of requirements-based testing activities. Use getMetrics to access the metric data, which is returned as an array of metric.Result objects. The metrics analyze testing artifacts such as requirements, test results, and coverage results. Use the metric data to assess the status and quality of your requirements-based model testing.

Creation

Description

metric_engine = metric.Engine() creates a metric engine object that collects metric data on the current project.

metric_engine = metric.Engine(projectPath) opens the project projectPath and creates a metric engine object that collects metric data on the project.

Input Arguments

projectPath: Path of the project for which you want to collect metric data, specified as a character vector or string scalar.
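For instance, you can pass the project path when you create the engine. The path below is a placeholder for illustration; substitute the location of your own project.

metric_engine = metric.Engine('C:\work\MyModelTestingProject');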

Properties

Project: Project for which the engine collects metric data, returned as a string. This property is read-only.

Object Functions

execute: Collect metric data for Model Testing Dashboard
getMetrics: Access metric data for model testing artifacts
generateReport: Generate report file that contains metric results
openArtifact: Open testing artifact traced from the metric result
getArtifactErrors: Return errors that occurred during artifact tracing
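A typical workflow chains several of these functions: execute the engine, check for artifacts that could not be traced, and then report on the results. The sketch below assumes an open project; the metric identifier and the report option names shown are illustrative rather than an exhaustive reference.

metric_engine = metric.Engine();
% Collect data for one metric (identifier is an example)
execute(metric_engine,{'RequirementsPerTestCase'});
% Check for artifacts that could not be traced during collection
traceErrors = getArtifactErrors(metric_engine);
% Generate a report file of the collected results
% (the 'Type' option value is an assumption; other formats may be available)
reportPath = generateReport(metric_engine,'Type','html-file');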

Examples

Use a metric.Engine object to collect metric data on the requirements-based testing artifacts in a project.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Collect results for the metric Requirements per test case by executing the metric engine.

execute(metric_engine,{'RequirementsPerTestCase'});

Use the function getMetrics to access the results.

results = getMetrics(metric_engine,'RequirementsPerTestCase');
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['  Number of Requirements: ',num2str(results(n).Value)])
end
Test Case: Set button
  Number of Requirements: 0
Test Case: Decrement button hold
  Number of Requirements: 1
Test Case: Resume button
  Number of Requirements: 1
Test Case: Cancel button
  Number of Requirements: 1
Test Case: Decrement button short
  Number of Requirements: 2
Test Case: Increment button hold
  Number of Requirements: 1
Test Case: Increment button short
  Number of Requirements: 2
Test Case: Enable button
  Number of Requirements: 1

The results show that the test case Set button is missing links to requirements. To fix this, you would link the test case to the requirement that it verifies.
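After linking the test case to its requirement, you could re-collect the metric and confirm that the count for Set button is no longer zero. This is a sketch; the actual values depend on the state of your project.

% Re-run the metric and inspect the updated results
execute(metric_engine,{'RequirementsPerTestCase'});
results = getMetrics(metric_engine,'RequirementsPerTestCase');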

Introduced in R2020b