How to view, use, and compare model metrics.
Diagonal cells of the confusion matrix indicate true positive predictions (i.e., the predicted feature matches the ground truth feature). Off-diagonal cells correspond to false positives and false negatives (i.e., the predicted feature does not match the ground truth feature).
The confusion matrix also includes an additional feature that is not part of your model run ontology: the `None` feature. The `None` feature is useful for identifying predictions that were not matched to any annotation, as well as annotations that were not matched to any prediction.
The confusion matrix is interactive: clicking any cell opens the gallery view of the model run, filtered to show only the examples that correspond to that cell.
Confidence scores range between zero (`0`) and one (`1`). Predictions with confidence levels below zero are ignored. Predictions uploaded without confidence scores are treated as if their confidence score was set to one (`1`).
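As a rough sketch of these rules (plain Python, not the Labelbox SDK; the dictionary layout and the `confidence` key are assumptions for illustration):

```python
def effective_confidence(prediction):
    """Apply the confidence rules described above."""
    score = prediction.get("confidence")
    if score is None:
        return 1.0        # missing score is treated as one (1)
    return score

def keep_for_metrics(prediction, threshold):
    score = effective_confidence(prediction)
    if score < 0:
        return False      # confidence below zero is ignored
    return score >= threshold

predictions = [
    {"name": "cat", "confidence": 0.8},
    {"name": "dog"},                       # no score -> treated as 1.0
    {"name": "bird", "confidence": -0.2},  # invalid -> ignored
]
kept = [p for p in predictions if keep_for_metrics(p, threshold=0.5)]
print([p["name"] for p in kept])  # ['cat', 'dog']
```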
A true positive occurs when a prediction and an annotation of the same class have an intersection-over-union (IoU) value higher than the selected IoU threshold.
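For intuition, here is a minimal sketch of computing IoU for two axis-aligned bounding boxes and applying the threshold test (plain Python; the `(x1, y1, x2, y2)` corner convention is an assumption for illustration):

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(pred_box, gt_box, same_class, iou_threshold=0.5):
    """True positive: same class and IoU above the selected threshold."""
    return same_class and iou(pred_box, gt_box) > iou_threshold

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))                     # ~0.143
print(is_true_positive((0, 0, 10, 10), (1, 1, 10, 10), True))  # True (IoU = 0.81)
```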
When browsing model run data rows, you can use threshold values to filter the data rows to different value ranges. This helps you understand how each threshold affects the automatic metrics and the confusion matrix.
By default, you can toggle between a set of predefined threshold values. You can also access the threshold settings to customize the predefined values.
While Labelbox calculates metrics, a notification banner appears with an exact count of the data rows remaining.
| Data Type | Annotation Type |
| --- | --- |
| Image | Classification, bounding box, segmentation, polygon, polyline, point |
| Geospatial | Classification, bounding box, segmentation, polygon, polyline, point |
| Text | Classification, named entity (NER) |