
slmetric.metric.ResultClassification Class

Namespace: slmetric.metric

(To be removed) Access metric data threshold results

The Metrics Dashboard user interface, metricdashboard function, slmetric package API, and corresponding customizations will be removed in a future release. For more information, see Migrating from Metrics Dashboard to Model Maintainability Dashboard.

Description

For the Value and AggregatedValue properties of an slmetric.metric.Result object, access properties of the slmetric.metric.ResultClassification class to determine the metric data ranges that correspond to the Compliant, NonCompliant, and Warning categories. From an slmetric.metric.ResultClassification object, also determine which of the three categories your metric data falls under.

Construction

The value of the Classifications property of an slmetric.metric.Result object is an slmetric.metric.ResultClassification object.
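
For example, this minimal sketch (assuming result is an slmetric.metric.Result object collected while a threshold configuration is active, as in the example below) shows how to read the classification category through that property:

% result is assumed to be an slmetric.metric.Result object with thresholds applied
rc = result.Classifications;               % slmetric.metric.ResultClassification object
category = rc.Classification.Category;     % 'Compliant', 'Warning', 'NonCompliant', or 'Uncategorized'
disp(['Category for ', result.ComponentPath, ': ', category]);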

Properties


Access this property to determine the model metric and the slmetric.metric.Result property that has thresholds.

Metric data values fall into one of these four categories:

  • Compliant — Metric data that is in an acceptable range.

  • Warning — Metric data that requires review.

  • NonCompliant — Metric data that requires you to modify your model.

  • Uncategorized — Metric data that has no threshold values set.

If at least one component is NonCompliant, this property returns NonCompliant. If at least one component is Warning and no components are NonCompliant, this property returns Warning. If all components are Compliant, this property returns Compliant.

This property is read-only.
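
As an illustration of this roll-up rule only (this hypothetical helper is not part of the slmetric API), the following function reproduces the stated precedence of NonCompliant, then Warning, then Compliant for a cell array of component categories:

function agg = aggregateCategory(categories)
% categories is a cell array of character vectors, for example {'Compliant','Warning'}
if any(strcmp(categories, 'NonCompliant'))
    agg = 'NonCompliant';   % any NonCompliant component makes the aggregate NonCompliant
elseif any(strcmp(categories, 'Warning'))
    agg = 'Warning';        % otherwise, any Warning component makes the aggregate Warning
else
    agg = 'Compliant';      % all components are Compliant
end
end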

Examples


For the mathworks.metrics.SimulinkBlockCount metric, define threshold values for the Value property of slmetric.metric.Result objects that correspond to the Compliant, NonCompliant, and Warning categories. For the sldemo_mdlref_basic model, run the metrics engine and categorize results for this metric.

Open the model.

openExample('sldemo_mdlref_basic'); 

Create an slmetric.config.Configuration object.

CONF = slmetric.config.Configuration.new('name', 'Config');

Get the default slmetric.config.ThresholdConfiguration object in CONF.

TC = getThresholdConfigurations(CONF);

Add an slmetric.config.Threshold object to the slmetric.config.ThresholdConfiguration object. This threshold is for the mathworks.metrics.SimulinkBlockCount metric and the Value property of the slmetric.metric.Result object.

T = addThreshold(TC, 'mathworks.metrics.SimulinkBlockCount', 'Value');

An slmetric.config.Threshold object contains a default slmetric.config.Classification object that corresponds to the Compliant category. Use the slmetric.metric.MetricRange class to specify metric values for the Compliant, NonCompliant, and Warning metric ranges.

C = getClassifications(T); % default classification is Compliant
C.Range.Start = 5;
C.Range.IncludeStart = 0;
C.Range.End = 100;
C.Range.IncludeEnd = 0;

C = addClassification(T,'Warning');
C.Range.Start = -inf;
C.Range.IncludeStart = 0;
C.Range.End = 5;
C.Range.IncludeEnd = 1;

C = addClassification(T,'NonCompliant');
C.Range.Start = 100;
C.Range.IncludeStart = 1;
C.Range.End = inf;
C.Range.IncludeEnd = 0;

Use the validate method to check that the metric ranges that you specified for the slmetric.config.Threshold object are valid.

validate(T)

If the ranges are not valid, you get an error message. In this example, the ranges are valid.
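
If you prefer to report the problem rather than stop the script, you could wrap the check in a try/catch block; this is only an illustrative sketch and is not required by the API:

try
    validate(T);
    disp('Threshold ranges are valid.');
catch validationError
    % Incomplete or overlapping ranges cause validate to throw an error
    disp(['Invalid threshold ranges: ', validationError.message]);
end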

Save the configuration to a file. Use the slmetric.config.setActiveConfiguration function to make this the configuration that the metric engine uses.

configName = 'Config.xml';
save(CONF,'FileName', configName);
slmetric.config.setActiveConfiguration(fullfile(pwd, configName));

Create an slmetric.Engine object, set the model as the analysis root, and collect data for the mathworks.metrics.SimulinkBlockCount metric.

metric_engine = slmetric.Engine();
setAnalysisRoot(metric_engine, 'Root',  'sldemo_mdlref_basic');
execute(metric_engine, 'mathworks.metrics.SimulinkBlockCount');

Get the model metric data, which is returned as an array of slmetric.metric.ResultCollection objects, res_col.

res_col = getMetrics(metric_engine, 'mathworks.metrics.SimulinkBlockCount');

Display the results for the mathworks.metrics.SimulinkBlockCount metric.

for n=1:length(res_col)
    if res_col(n).Status == 0
        result = res_col(n).Results;
        
        for m=1:length(result)
            disp(['MetricID: ',result(m).MetricID]);
            disp(['  ComponentPath: ', result(m).ComponentPath]);
            disp(['  Value: ', num2str(result(m).Value)]);
            disp(['  Classifications: ', result(m).Classifications.Classification.Category]);
            disp(['  Measures: ', num2str(result(m).Measures)]);
            disp(['  AggregatedMeasures: ', num2str(result(m).AggregatedMeasures)]);
        end
    else
        disp(['No results for: ', res_col(n).MetricID]);
    end
    disp(' ');
end
MetricID: mathworks.metrics.SimulinkBlockCount
  ComponentPath: sldemo_mdlref_basic
  Value: 12
  Classifications: Compliant
  Measures: 
  AggregatedMeasures: 
MetricID: mathworks.metrics.SimulinkBlockCount
  ComponentPath: sldemo_mdlref_basic/More Info
  Value: 0
  Classifications: Warning
  Measures: 
  AggregatedMeasures: 
MetricID: mathworks.metrics.SimulinkBlockCount
  ComponentPath: sldemo_mdlref_counter
  Value: 18
  Classifications: Compliant
  Measures: 
  AggregatedMeasures:

For ComponentPath: sldemo_mdlref_basic and ComponentPath: sldemo_mdlref_counter, the results are Compliant because the values 12 and 18 fall within the Compliant range of 5 to 100. For ComponentPath: sldemo_mdlref_basic/More Info, the result falls under the Warning category because the value 0 is in the Warning range of -inf to 5.
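
As a follow-up, this minimal sketch (reusing the res_col array collected above) shows one way to list only the components whose metric data is not Compliant:

% List components that need attention (Warning, NonCompliant, or Uncategorized)
for n = 1:length(res_col)
    if res_col(n).Status == 0
        result = res_col(n).Results;
        for m = 1:length(result)
            category = result(m).Classifications.Classification.Category;
            if ~strcmp(category, 'Compliant')
                disp([result(m).ComponentPath, ' is ', category]);
            end
        end
    end
end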

Version History

Introduced in R2018b


R2022a: Metrics Dashboard will be removed

The Metrics Dashboard user interface, metricdashboard function, slmetric package API, and corresponding customizations will be removed in a future release. For more information, see Migrating from Metrics Dashboard to Model Maintainability Dashboard.