getAvailableMetricIds

Return metric identifiers for available metrics

Since R2021b

    Description

    availableMetricIds = getAvailableMetricIds(metricEngine) returns the metric identifiers for the metrics available for the specified metric.Engine object. By default, the list includes only the metrics available with the current installation.

    availableMetricIds = getAvailableMetricIds(metricEngine,'App','DashboardApp','Dashboard',dashboardIdentifier) returns the metric identifiers for the dashboard specified by dashboardIdentifier.

    For example, this code returns the metric identifiers for the Model Testing Dashboard:

    availableMetricIds = getAvailableMetricIds(metricEngine,...
    'App','DashboardApp',...
    'Dashboard','ModelUnitTesting');

    For an additional syntax to display metric identifiers for design cost estimation, see getAvailableMetricIds (Fixed-Point Designer).

    availableMetricIds = getAvailableMetricIds(___,'Installed',installationStatus) returns the metric identifiers, filtered by installationStatus.

    For example, specifying installationStatus as false returns the metric identifiers for each of the available metrics, even if the associated MathWorks® products are not currently installed on your machine.
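    For example, this code combines the 'Installed' name-value argument with the dashboard syntax to list every metric identifier for the Model Testing Dashboard, whether or not the associated products are installed:

    availableMetricIds = getAvailableMetricIds(metricEngine,...
    'App','DashboardApp',...
    'Dashboard','ModelUnitTesting',...
    'Installed',false);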

    Examples

    Collect metric results on the requirements-based testing artifacts in a project.

    Open a project that includes the models and testing files. For this example, in the MATLAB® Command Window, enter:

    openExample("slcheck/ExploreTestingMetricDataInModelTestingDashboardExample");
    openProject("cc_CruiseControl");

    Create a metric.Engine object for the project.

    metric_engine = metric.Engine();

    Update the trace information for metric_engine to ensure that the artifact information is up to date.

    updateArtifacts(metric_engine)

    Create a list of the available metric identifiers for the Model Testing Dashboard by specifying the dashboard identifier as 'ModelUnitTesting'.

    metric_ids = getAvailableMetricIds(metric_engine,...
    'App','DashboardApp',...
    'Dashboard','ModelUnitTesting');

    Collect results by executing the metric engine on the list of metric identifiers.

    execute(metric_engine,metric_ids);
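    After execution completes, you can retrieve the collected results from the engine, for example with the getMetrics method (shown here as a sketch; see the metric.Engine reference page for details):

    results = getMetrics(metric_engine,metric_ids);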

    Input Arguments

    metricEngine — Metric engine object

    Metric engine object for which you want to collect metric results, specified as a metric.Engine object.

    dashboardIdentifier — Identifier for dashboard

    Identifier for the dashboard, specified as one of these values:

    • "ModelMaintainability" — Return each of the model maintainability metric identifiers.

    • "ModelUnitPILTesting" — Return each of the processor-in-the-loop (PIL) code testing metric identifiers.

    • "ModelUnitSILTesting" — Return each of the software-in-the-loop (SIL) code testing metric identifiers.

    • "ModelUnitTesting" — Return the model testing metric identifiers associated with your project.

    Note that if you want to hide requirements metrics from the API results, you can select Hide requirements metrics in the dashboard Options. For more information, see Hide Requirements Metrics in Model Testing Dashboard and in API Results.

    Example: 'ModelUnitTesting'

    installationStatus — Filter for metric installation status

    Filter for metric installation status, specified as one of these values:

    • 1 (true) — Returns only metric identifiers associated with the MathWorks products currently installed on your machine.

    • 0 (false) — Returns metric identifiers for each of the available metrics, even if the associated MathWorks products are not currently installed on your machine. You can use the list of each of the available metric identifiers to access the metric results collected on a different machine.

    Example: false

    Data Types: logical

    Output Arguments

    availableMetricIds — Metric identifiers for available metrics

    Metric identifiers for the available metrics, returned as a string or string array.

    For information on the metrics and their identifiers, see the metric reference pages for the corresponding dashboard.

    Example: "slcomp.mt.CoverageBreakdown"

    Example: ["slcomp.mt.CoverageBreakdown", "RequirementWithTestCaseDistribution", "RequirementsPerTestCaseDistribution", "slcomp.mt.TestsStatusDistribution", "TestCaseTagDistribution", "TestCaseTypeDistribution", "TestCaseVerificationStatusDistribution", "TestCaseWithRequirementDistribution", "TestCasesPerRequirementDistribution"]

    Version History

    Introduced in R2021b
