Version: 1.37

Monitor a Deployment

Metrics

Metrics can be monitored on the Monitoring tab. You can either choose a custom start and end date and time, or select a predefined period. For some metrics, only certain predefined timeframes can be selected, directly in the metric's own section. Within the Monitoring tab, the metrics are grouped into the following sub-tabs.

Traffic

There are three traffic metrics:

  • Activity: the number of predictions
  • Errors: the number of failed requests
  • Response time: the average response time per request
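
As a rough illustration, these three numbers can be read as simple aggregations over a Deployment's request log. The sketch below uses a hypothetical record layout and is not Deeploy's internal implementation:

```python
# Minimal sketch of how the three traffic metrics relate to a request log.
# The request records below are hypothetical, not Deeploy's internal format.
requests = [
    {"status": 200, "latency_ms": 42},
    {"status": 200, "latency_ms": 58},
    {"status": 500, "latency_ms": 120},
]

activity = len(requests)                                  # Activity: number of predictions
errors = sum(1 for r in requests if r["status"] >= 400)   # Errors: failed requests
avg_response_ms = sum(r["latency_ms"] for r in requests) / activity  # Response time

print(activity, errors, round(avg_response_ms, 1))  # 3 1 73.3
```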

Performance

Performance metrics are only available once you have supplied a sufficient number of actuals. For classification models, the following metrics are available:

  • Accuracy
  • Precision
  • Recall
  • F1

For regression models, the following metrics are available:

  • Root mean squared error (RMSE)
  • Mean absolute error (MAE)
  • Mean absolute percentage error (MAPE)
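
Deeploy computes these metrics for you from the actuals you submit. As a rough illustration of what the numbers express, the sketch below reproduces them locally with scikit-learn on made-up data; it is not necessarily identical to Deeploy's computation:

```python
# Rough illustration only: computes the listed metrics locally with scikit-learn
# on made-up predictions and actuals.
from sklearn.metrics import (
    accuracy_score, precision_score, recall_score, f1_score,
    mean_squared_error, mean_absolute_error, mean_absolute_percentage_error,
)

# Classification: predicted classes vs. supplied actuals
y_true, y_pred = [1, 0, 0, 1, 0], [1, 0, 1, 1, 0]
print(accuracy_score(y_true, y_pred))   # Accuracy
print(precision_score(y_true, y_pred))  # Precision
print(recall_score(y_true, y_pred))     # Recall
print(f1_score(y_true, y_pred))         # F1

# Regression: predicted values vs. supplied actuals
y_true, y_pred = [10.0, 9.0, 4.5], [10.2, 8.1, 5.0]
print(mean_squared_error(y_true, y_pred) ** 0.5)        # RMSE
print(mean_absolute_error(y_true, y_pred))              # MAE
print(mean_absolute_percentage_error(y_true, y_pred))   # MAPE
```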
tip

To make sure the correct metrics can be shown, define a problem type in your Deployment's metadata.
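
As a sketch of what such metadata could look like when written to metadata.json: the problemType and features field names below are assumptions based on the tips on this page, so consult the Deployment metadata reference for the exact schema. The same features list is what the tips further down this page refer to.

```python
# Hypothetical sketch of writing Deployment metadata; the "problemType" and
# "features" field names are assumptions, not a verified schema -- consult the
# Deployment metadata reference for the exact format.
import json

metadata = {
    "problemType": "classification",  # or e.g. "regression"
    "features": [                     # feature names also enhance the graphs below
        {"name": "age"},
        {"name": "income"},
    ],
}

with open("metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```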

Evaluation

Evaluation metrics are only available once you have supplied a sufficient number of evaluations. Because these metrics are specific to Deeploy, they are explained in more detail below.

info

A disagreement is a situation in which the outcome of a prediction does not match the desired response indicated by an evaluator.

There are two evaluation metrics:

  • Disagreement ratio: the ratio of 'Disagree' evaluations to the total number of evaluations
  • Disagreement per class: how disagreements are distributed across outcomes
    • For classification models: each class in the graph represents an outcome class of the model, up to a limit of 5 classes. If the model has more outcome classes, the 4 classes with the most disagreements are shown and the remaining classes are grouped into one.
    • For regression models: outcomes are grouped into 5 classes, with the outcomes divided evenly among them.
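
Deeploy derives both metrics from the evaluations you submit. Purely as an illustration, the sketch below shows the two computations on a hypothetical list of evaluation records:

```python
# Hypothetical evaluation records: each holds the evaluator's verdict and the
# predicted outcome class. Not Deeploy's internal format.
from collections import Counter

evaluations = [
    {"verdict": "agree",    "outcome": "approved"},
    {"verdict": "disagree", "outcome": "rejected"},
    {"verdict": "disagree", "outcome": "rejected"},
    {"verdict": "agree",    "outcome": "approved"},
]

disagreements = [e for e in evaluations if e["verdict"] == "disagree"]

# Disagreement ratio: 'Disagree' evaluations over all evaluations
ratio = len(disagreements) / len(evaluations)  # 0.5

# Disagreement per class: count disagreements per outcome class; keep the
# 4 classes with the most disagreements and group the rest (the 5-class limit)
per_class = Counter(e["outcome"] for e in disagreements)
top_4 = per_class.most_common(4)
other = sum(per_class.values()) - sum(count for _, count in top_4)
print(ratio, top_4, other)
```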
tip

To enhance the disagreement per class graph, define features in your metadata.json.

Drift

Drift metrics are only available once you have supplied a sufficient number of predictions.

  • Input validation: the range of a given feature index of your input is presented in a boxplot (see the sketch below).
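
As a rough illustration of the statistics behind such a boxplot, the sketch below computes the quartiles of one feature index over a set of made-up inputs; Deeploy does this for you from the prediction requests it receives.

```python
# Rough illustration of the statistics behind a per-feature boxplot.
# The input rows are made up; Deeploy derives these from your prediction requests.
import numpy as np

# Each row is one prediction input, each column one feature index
inputs = np.array([
    [0.5, 12.0],
    [0.7, 15.0],
    [0.4, 11.0],
    [0.9, 30.0],
])

feature_index = 1
values = inputs[:, feature_index]
q1, median, q3 = np.percentile(values, [25, 50, 75])
print(values.min(), q1, median, q3, values.max())  # min, quartiles, max of the feature
```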
tip

To enhance the experience of your input validation boxplot, define features in your metadata.json.

Alerts

Deployment owners can create and manage alerts for metrics on the Alerts tab of a Deployment's Monitoring page. Alerts notify you when a metric crosses a defined threshold (e.g., when your model's response time exceeds 100 ms).

As a Deployment owner, you receive emails and notifications for all alerts by default. To change this, update your email and notification preferences on the Account Preferences page.