Overview
To upload predictions to Labelbox, you need to create a prediction payload. This section provides the payload format for every supported annotation type.

Predictions Payload Types
Labelbox supports two formats for the annotations payload:

Python annotation types (recommended)
- Provides a seamless transition between third-party platforms, machine learning pipelines, and Labelbox.
- Allows you to build annotations locally with local file paths, NumPy arrays, or URLs.
- Easily converts to NDJSON format to quickly import annotations to Labelbox.
- Supports one level of nested classification (radio, checklist, or free-form text) under a tool or classification annotation.

JSON
- Skips formatting the annotation payload in the Labelbox Python annotation type.
- Supports any level of nested classification (radio, checklist, or free-form text) under a tool or classification annotation.
Confidence Score
You can include confidence scores and custom metrics when you upload your model predictions to a model run. In addition, given the predictions and annotations in a model run, Labelbox automatically calculates some metrics upon upload.

Uploading confidence scores is optional. If you do not specify a confidence score, the prediction is treated as if it had a confidence score of 1.

Supported Predictions
The following predictions are supported for an image data row:
- Radio
- Checklist
- Free-form text
- Bounding box
- Point
- Polyline
- Polygon
- Segmentation masks
Classification: Radio (single-choice)
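As a sketch in the NDJSON format, a radio (single-choice) prediction looks like the payload below. The `radio_question` and `first_radio_answer` names are placeholders that must match the feature names defined in your ontology:

```python
# NDJSON-style radio prediction payload (feature names are placeholders
# that must match your ontology).
radio_prediction_ndjson = {
    "name": "radio_question",
    "answer": {
        "name": "first_radio_answer",
        "confidence": 0.5,  # optional; treated as 1 if omitted
    },
}
```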
Classification: Nested radio
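A nested radio prediction adds a `classifications` list to the selected answer. As a hedged NDJSON-style sketch, with all feature names as ontology placeholders:

```python
# NDJSON-style nested radio prediction: one sub-classification is
# attached to the chosen top-level answer (names are placeholders).
nested_radio_prediction_ndjson = {
    "name": "nested_radio_question",
    "answer": {
        "name": "first_radio_answer",
        "confidence": 0.5,
        "classifications": [
            {
                "name": "sub_radio_question",
                "answer": {"name": "first_sub_radio_answer", "confidence": 0.5},
            }
        ],
    },
}
```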
Classification: Nested checklist
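Similarly, a nested checklist prediction nests a `classifications` list under each checked answer. A sketch in the NDJSON format, with placeholder feature names:

```python
# NDJSON-style nested checklist prediction (names are placeholders
# that must match your ontology).
nested_checklist_prediction_ndjson = {
    "name": "nested_checklist_question",
    "answer": [
        {
            "name": "first_checklist_answer",
            "confidence": 0.5,
            "classifications": [
                {
                    "name": "sub_checklist_question",
                    "answer": {
                        "name": "first_sub_checklist_answer",
                        "confidence": 0.5,
                    },
                }
            ],
        }
    ],
}
```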
Checklist (multiple choice)
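A checklist (multiple-choice) prediction carries a list of answers, each with its own optional confidence score. A minimal NDJSON-style sketch with placeholder names:

```python
# NDJSON-style checklist prediction: multiple answers may be selected
# at once (names are ontology placeholders).
checklist_prediction_ndjson = {
    "name": "checklist_question",
    "answer": [
        {"name": "first_checklist_answer", "confidence": 0.5},
        {"name": "second_checklist_answer", "confidence": 0.5},
    ],
}
```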
Bounding box
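A bounding box prediction in NDJSON format is defined by its top-left corner plus height and width, in pixels. The coordinates below are illustrative and the tool name is a placeholder:

```python
# NDJSON-style bounding box prediction: top/left anchor the box,
# height/width give its extent in pixels (values are illustrative).
bbox_prediction_ndjson = {
    "name": "bounding_box",
    "confidence": 0.5,
    "bbox": {"top": 977, "left": 1690, "height": 330, "width": 225},
}
```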
Bounding box with nested classification
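To nest a classification under a bounding box, add a `classifications` list alongside the `bbox` geometry. A hedged NDJSON-style sketch with placeholder names and illustrative coordinates:

```python
# NDJSON-style bounding box with a nested radio sub-classification
# (feature names are placeholders; coordinates are illustrative).
bbox_with_radio_subclass_ndjson = {
    "name": "bbox_with_radio_subclass",
    "confidence": 0.5,
    "bbox": {"top": 933, "left": 541, "height": 191, "width": 330},
    "classifications": [
        {
            "name": "sub_radio_question",
            "answer": {"name": "first_sub_radio_answer", "confidence": 0.5},
        }
    ],
}
```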
Polygon
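A polygon prediction lists its vertices as pixel coordinates; to close the shape, the last point repeats the first. A sketch with illustrative values:

```python
# NDJSON-style polygon prediction: vertex list in pixel coordinates,
# closed by repeating the first point (values are illustrative).
polygon_prediction_ndjson = {
    "name": "polygon",
    "confidence": 0.5,
    "polygon": [
        {"x": 1489.5, "y": 898.8},
        {"x": 1619.8, "y": 1286.4},
        {"x": 1292.1, "y": 1367.3},
        {"x": 1489.5, "y": 898.8},  # closes the polygon
    ],
}
```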
Classification: free-form text
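A free-form text prediction is the simplest payload: a feature name and an answer string. An NDJSON-style sketch with a placeholder name:

```python
# NDJSON-style free-form text prediction (name is an ontology placeholder).
text_prediction_ndjson = {
    "name": "free_text",
    "confidence": 0.5,
    "answer": "sample text",
}
```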
Segmentation mask
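In NDJSON format, a segmentation mask prediction references a cloud-hosted mask image plus the RGB color that identifies the instance within it. The URL below is a placeholder for your own hosted mask:

```python
# NDJSON-style segmentation mask prediction: instanceURI points at a
# hosted mask image (placeholder URL), colorRGB selects the instance.
mask_prediction_ndjson = {
    "name": "mask",
    "confidence": 0.5,
    "mask": {
        "instanceURI": "https://storage.example.com/mask.png",  # placeholder
        "colorRGB": [255, 255, 255],
    },
}
```

With the Python annotation types, you can instead build the mask locally from a NumPy array, which is one reason that format is recommended.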
Segmentation mask with nested classification
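As with bounding boxes, a nested classification attaches to a mask via a `classifications` list. A hedged NDJSON-style sketch, with a placeholder mask URL and placeholder feature names:

```python
# NDJSON-style segmentation mask with a nested radio sub-classification
# (URL and feature names are placeholders).
mask_with_text_subclass_ndjson = {
    "name": "mask_with_sub_classification",
    "confidence": 0.5,
    "mask": {
        "instanceURI": "https://storage.example.com/mask.png",  # placeholder
        "colorRGB": [255, 255, 255],
    },
    "classifications": [
        {
            "name": "sub_free_text",
            "answer": "sample text",
        }
    ],
}
```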
Point
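A point prediction is a single pixel coordinate. An NDJSON-style sketch with illustrative values:

```python
# NDJSON-style point prediction (coordinates are illustrative).
point_prediction_ndjson = {
    "name": "point",
    "confidence": 0.5,
    "point": {"x": 1166.6, "y": 1441.9},
}
```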
Polyline
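A polyline prediction lists an ordered sequence of points; unlike a polygon, the line is not closed. An NDJSON-style sketch with illustrative coordinates:

```python
# NDJSON-style polyline prediction: an open, ordered point sequence
# (coordinates are illustrative).
polyline_prediction_ndjson = {
    "name": "polyline",
    "confidence": 0.5,
    "line": [
        {"x": 2534.4, "y": 249.1},
        {"x": 2429.5, "y": 355.8},
        {"x": 2294.5, "y": 478.5},
    ],
}
```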
Example: Upload predictions to model run
To upload predictions to a model run, follow the steps below.

Before you start
These examples require the Labelbox Python SDK. Replace API_KEY with a valid API key to connect to the Labelbox client.
Step 1: Import data rows into Catalog
Step 2: Set up ontology for predictions
Your model run ontology must support all the tools and classifications used in your predictions. This example shows how to create an ontology containing all supported prediction types.

Step 3: Create a model and a model run
Create a model using the ontology, then create a model run for it.

Step 4: Send data rows to the model run
Step 5: Create prediction payloads
For help creating prediction payloads, see supported predictions. You can declare payloads as Python annotation types (preferred) or as NDJSON objects. This example demonstrates each format and shows how to compose annotations into labels attached to the data rows. The resulting label_prediction_ndjson and label_prediction payloads should contain exactly the same prediction content, except for the generated uuid values.
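As a sketch of the NDJSON side of this step, each prediction dict is attached to its data row by a `dataRow` global key and given a generated `uuid` before upload. The global key and the single radio annotation below are placeholders:

```python
import uuid

# Placeholder global key identifying the target data row.
global_key = "example-image-1.jpg"

# One placeholder prediction; in practice you would collect every
# annotation created in this step.
radio_prediction_ndjson = {
    "name": "radio_question",
    "answer": {"name": "first_radio_answer", "confidence": 0.5},
}

label_prediction_ndjson = []
for annotation in [radio_prediction_ndjson]:
    annotation.update({
        "uuid": str(uuid.uuid4()),          # generated; differs per run
        "dataRow": {"globalKey": global_key},
    })
    label_prediction_ndjson.append(annotation)
```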
Step 6: Upload prediction payloads to model run
Step 7: Send annotations to a model run
This step is optional. This example creates a project with ground truth annotations to visualize both annotations and predictions in the model run. To send annotations to a model run:
1. Import them into a project.
2. Create a label payload.
3. Send them to the model run.