
metric.Result

Metric data for specified metric algorithm and requirements-based testing artifacts

Description

A metric.Result object contains the metric data for a specified metric algorithm and testing artifacts that trace to the specified component.

Creation

Description


metric_result = metric.Result creates a handle to a metric result object.

Alternatively, if you collect results by executing a metric.Engine object, calling the getMetrics function on the engine object returns the collected metric.Result objects in an array.
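For example, a minimal sketch of this workflow, using the RequirementsPerTestCase metric from the example below and assuming the current project contains requirements-based testing artifacts:

% Collect results with a metric engine, then retrieve them as metric.Result objects
metric_engine = metric.Engine();
execute(metric_engine,{'RequirementsPerTestCase'});
results = getMetrics(metric_engine,'RequirementsPerTestCase');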

Properties


MetricID — Metric identifier for the metric algorithm that calculated the results, returned as a string.

Example: 'TestCasesPerRequirementDistribution'

Artifacts — Testing artifacts for which the metric is calculated, returned as a structure or an array of structures. For each artifact that the metric analyzed, the returned structure contains these fields (see the sketch after this list):

  • UUID — Unique identifier of the artifact.

  • Name — Name of the artifact.

  • Type — Type of artifact.

  • ParentUUID — Unique identifier of the file that contains the artifact.

  • ParentName — Name of the file that contains the artifact.

  • ParentType — Type of file that contains the artifact.
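For example, a minimal sketch that lists each analyzed artifact and the file that contains it, assuming results is an array of metric.Result objects returned by getMetrics:

% Display each artifact together with its type and parent file
for n = 1:length(results)
    artifact = results(n).Artifacts(1);
    disp([artifact.Name,' (',artifact.Type,') in ',artifact.ParentName])
end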

Value — Value of the metric result for the specified algorithm and artifacts, returned as an integer, string, double vector, or structure. For a list of metrics and their result values, see Model Testing Metrics.
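Because the type of Value depends on the metric, you may want to branch on it before using the result. A minimal sketch using only standard MATLAB type checks, assuming results holds collected metric.Result objects:

% Inspect the first result value, whatever its type
val = results(1).Value;
if isstruct(val)
    disp(fieldnames(val))   % structure-valued result (for example, a distribution metric)
elseif isnumeric(val)
    disp(num2str(val))      % integer or double vector result
else
    disp(val)               % string result
end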

UserData — User data provided by the metric algorithm, returned as a string.

Examples


Collect metric data on the requirements-based testing artifacts in a project. Then, access the data by using the metric.Result objects.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Collect results for the metric Requirements per test case by using the execute function on the metric.Engine object.

execute(metric_engine,{'RequirementsPerTestCase'});

Use the getMetrics function to access the results. Assign the array of result objects to the results variable.

results = getMetrics(metric_engine,'RequirementsPerTestCase');

Access the metric results data by using the properties of the metric.Result objects in the array.

for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['  Number of Requirements: ',num2str(results(n).Value)])
end
Test Case: Set button
  Number of Requirements: 0
Test Case: Decrement button hold
  Number of Requirements: 1
Test Case: Resume button
  Number of Requirements: 1
Test Case: Cancel button
  Number of Requirements: 1
Test Case: Decrement button short
  Number of Requirements: 2
Test Case: Increment button hold
  Number of Requirements: 1
Test Case: Increment button short
  Number of Requirements: 2
Test Case: Enable button
  Number of Requirements: 1
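To process the collected data further, you can gather the values into a numeric vector. A minimal sketch, assuming each result in this example has a scalar numeric Value:

% Count the test cases that have no linked requirements
values = arrayfun(@(r) r.Value, results);
disp(['Test cases without linked requirements: ',num2str(nnz(values == 0))])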
Introduced in R2020b