metric.Engine

Collect metric data

Description

A metric.Engine object represents the metric engine that collects metric data when you call the execute object function. Use getMetrics to access the metric data, which is returned as an array of metric.Result objects. Use the metric data to assess the status and quality of your design. Model testing metrics analyze testing artifacts such as requirements, test results, and coverage results. Model maintainability metrics analyze the maintainability and complexity of the design. For additional metrics, see Design Cost Model Metrics (Fixed-Point Designer).

Creation

Description

metric_engine = metric.Engine() creates a metric engine object that collects metric data on the current project.

metric_engine = metric.Engine(projectPath) opens the project projectPath and creates a metric engine object that collects metric data on the project.
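For example, you can point the engine at a project folder directly and then inspect which project it analyzes. The path below is a placeholder; substitute the location of your own project:

```matlab
% Create an engine for a specific project (placeholder path; replace
% with the root folder or .prj file of your own project).
metric_engine = metric.Engine("C:\work\MyProject");

% The Project property stores the path of the project that the
% engine analyzes.
disp(metric_engine.Project)
```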

Input Arguments

projectPath - Path of the project for which you want to collect metric data, specified as a character vector or string scalar.

Properties

Project - Project for which the engine collects metric data, returned as a string. This property is read-only.

Object Functions

execute - Collect metric data
generateReport - Generate report file containing metric results
getArtifactErrors - Return errors that occurred during artifact tracing
getAvailableMetricIds - Return metric identifiers for available metrics
getMetrics - Access metric data
openArtifact - Open traced artifact from metric result
updateArtifacts - Update trace information for pending artifact changes in the project
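As a sketch of a typical workflow using these functions, you can query which metric identifiers are available before executing the engine. This assumes a project is already open on the MATLAB path:

```matlab
% Create an engine for the current project.
metric_engine = metric.Engine();

% List the identifiers of the metrics that the engine can collect
% for the artifacts in the current project.
metricIds = getAvailableMetricIds(metric_engine);
disp(metricIds)

% Collect data for every available metric.
execute(metric_engine, metricIds);
```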

Examples

Use a metric.Engine object to collect metric data on the design artifacts in a project.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Collect results for the metric "slcomp.OverallCyclomaticComplexity" by executing the metric engine. For more information on the metric, see Model Maintainability Metrics.

execute(metric_engine,'slcomp.OverallCyclomaticComplexity');

Use the function getMetrics to access the results.

results = getMetrics(metric_engine,'slcomp.OverallCyclomaticComplexity');
for n = 1:length(results)
    disp(['Model: ',results(n).Scope.Name])
    disp(['  Overall Design Cyclomatic Complexity: ',num2str(results(n).Value)])
end
Model: db_Controller
  Overall Design Cyclomatic Complexity: 1
Model: db_LightControl
  Overall Design Cyclomatic Complexity: 4
Model: db_ThrottleController
  Overall Design Cyclomatic Complexity: 4
Model: db_ControlMode
  Overall Design Cyclomatic Complexity: 22
Model: db_DriverSwRequest
  Overall Design Cyclomatic Complexity: 9

For more information on how to collect metrics for design artifacts, see Collect Model Maintainability Metrics Programmatically.

Use a metric.Engine object to collect metric data on the requirements-based testing artifacts in a project.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Update the trace information for metric_engine to ensure that the artifact information is up to date.

updateArtifacts(metric_engine)

Collect results for the metric RequirementsPerTestCase by executing the metric engine.

execute(metric_engine,{'RequirementsPerTestCase'});

Use the function getMetrics to access the results.

results = getMetrics(metric_engine,'RequirementsPerTestCase');
for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['  Number of Requirements: ',num2str(results(n).Value)])
end
Test Case: Set button
  Number of Requirements: 0
Test Case: Decrement button hold
  Number of Requirements: 1
Test Case: Resume button
  Number of Requirements: 1
Test Case: Cancel button
  Number of Requirements: 1
Test Case: Decrement button short
  Number of Requirements: 2
Test Case: Increment button hold
  Number of Requirements: 1
Test Case: Increment button short
  Number of Requirements: 2
Test Case: Enable button
  Number of Requirements: 1

The results show that the test case Set button is missing links to requirements. To fix this issue, link the test case to the requirement that it verifies.
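After collecting results, you can save them to a report file with the generateReport object function. The 'Type' and 'Location' name-value arguments shown here are one possible configuration; check the generateReport reference page for the options available in your release:

```matlab
% Generate an HTML report of the collected metric results in the
% current folder (file name is a placeholder).
reportLocation = fullfile(pwd, 'MetricResultsReport.html');
generateReport(metric_engine, 'Type', 'html-file', 'Location', reportLocation);
```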

Version History

Introduced in R2020b