
getMetrics

Access metric data for model testing artifacts

Description


results = getMetrics(metricEngine,metricIDs) returns metric results from the specified metric.Engine object for the metrics that you specify in metricIDs. To collect metric results for the metric.Engine object, use the execute function. Then, access the results by using getMetrics.


results = getMetrics(metricEngine,metricIDs,'ArtifactScope',unit) returns metric results for the artifacts in the unit that you specify. A unit is the smallest testable entity in your project that is represented by a model, and it includes referenced models. The artifacts in a unit are the models and the requirements, test cases, and test results that trace to the unit.
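
A minimal sketch of this workflow, assuming a project is already open in the current MATLAB session and using the metric identifier 'RequirementsPerTestCase' from the examples below:

metricEngine = metric.Engine();    % engine for the currently open project
updateArtifacts(metricEngine)      % refresh trace information for the artifacts

% Collect the metric, then access the results
execute(metricEngine,{'RequirementsPerTestCase'});
results = getMetrics(metricEngine,'RequirementsPerTestCase');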

Examples


Collect metric data on the requirements-based testing artifacts in a project.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Update the trace information for metric_engine to reflect any pending artifact changes and ensure that all test results are tracked.

updateArtifacts(metric_engine)

Collect results for the metric Requirements per test case by executing the metric engine.

execute(metric_engine,{'RequirementsPerTestCase'});

Access the metric results.

results = getMetrics(metric_engine,'RequirementsPerTestCase');
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['  Number of Requirements: ',num2str(results(n).Value)])
end
Test Case: Set button
  Number of Requirements: 0
Test Case: Decrement button hold
  Number of Requirements: 1
Test Case: Resume button
  Number of Requirements: 1
Test Case: Cancel button
  Number of Requirements: 1
Test Case: Decrement button short
  Number of Requirements: 2
Test Case: Increment button hold
  Number of Requirements: 1
Test Case: Increment button short
  Number of Requirements: 2
Test Case: Enable button
  Number of Requirements: 1

Collect metrics for one unit in the project. Specify the unit and collect metrics for only the artifacts that trace to the model.

Open the project that contains the model. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Update the trace information for metric_engine to reflect any pending artifact changes and ensure that all test results are tracked.

updateArtifacts(metric_engine)

Create a variable that represents the path to the model db_DriverSwRequest.

modelPath = fullfile(pwd, 'models', 'db_DriverSwRequest.slx');

Collect results for the metric Requirements per test case by using the execute function on the engine object and limiting the scope to the db_DriverSwRequest model.

execute(metric_engine,{'RequirementsPerTestCase'},'ArtifactScope',{modelPath, 'db_DriverSwRequest'});

Use getMetrics to access the results.

results = getMetrics(metric_engine,'RequirementsPerTestCase');
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['  Number of Requirements: ',num2str(results(n).Value)])
end
Test Case: Set button
  Number of Requirements: 0
Test Case: Resume button
  Number of Requirements: 1
Test Case: Decrement button short
  Number of Requirements: 2
Test Case: Enable button
  Number of Requirements: 1
Test Case: Increment button hold
  Number of Requirements: 1
Test Case: Increment button short
  Number of Requirements: 2
Test Case: Cancel button
  Number of Requirements: 1
Test Case: Decrement button hold
  Number of Requirements: 1

Input Arguments


metricEngine - Metric engine object for which you want to access metric results, specified as a metric.Engine object.

metricIDs - Metric identifiers for the metrics that you want to access, specified as a character vector or cell array of character vectors. For a list of metrics and their identifiers, see Model Testing Metrics.

Example: 'TestCasesPerRequirementDistribution'

Example: {'TestCaseStatus', 'DecisionCoverageBreakdown'}
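
Because metricIDs accepts a cell array, you can retrieve several metrics in one call. A short sketch using the two identifiers from the example above, assuming execute has already collected them:

ids = {'TestCaseStatus','DecisionCoverageBreakdown'};
execute(metric_engine,ids);              % collect both metrics
results = getMetrics(metric_engine,ids); % one result array spanning both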

unit - Path and name of the unit for which you want to access metric results, specified as a cell array where the first entry is the full path to the model file and the second entry is the name of the block diagram. When you use this argument, the metric engine returns results for the artifacts that trace to the unit model.

Example: {'C:\work\MyModel.slx', 'MyModel'}
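
For instance, a sketch that scopes collection and retrieval to one unit, reusing the hypothetical path from the example above:

% First entry: full path to the model file; second entry: block diagram name
unit = {'C:\work\MyModel.slx','MyModel'};
execute(metric_engine,{'RequirementsPerTestCase'},'ArtifactScope',unit);
results = getMetrics(metric_engine,'RequirementsPerTestCase','ArtifactScope',unit);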

Output Arguments


results - Metric results, returned as an array of metric.Result objects.
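
The examples above read two properties of each metric.Result: Value, the metric value, and Artifacts, the artifact or artifacts that the value was measured on. A compact sketch of that access pattern (no other properties are assumed here):

for n = 1:length(results)
    % Each result pairs a metric value with the artifact it was measured on
    fprintf('%s: %s\n',results(n).Artifacts(1).Name,num2str(results(n).Value));
end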

Introduced in R2020b