Import ground truth

Import your previously generated ground truth data into Labelbox from internal or third-party tools.

πŸ“˜

Only supported in Python SDK 3.7.0 or later

How it works

The annotation import method lets you bring ground truth annotations created in internal or third-party labeling systems into Labelbox Annotate, so you can organize all of your data in one place. Importing external data through the label import API is a useful way to consolidate and migrate all of your annotations into Labelbox as a single source of truth.

You can also use this method to import ground truth annotations when setting up the Model Diagnostics workflow.

Imported annotations will appear when the asset is opened in the Editor as long as the following conditions are met:

  • The imported annotations are assigned to a Data Row within a dataset that is attached to the project
  • The asset has not already been labeled in the Labelbox Editor

Step 1: Prepare your NDJSON payload

The annotation import works similarly to the import method used for Model-assisted labeling. Prepare the annotation payload in NDJSON format.

See the model-assisted labeling annotation type support documentation to learn how to construct the payload and which annotation types are supported.
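
As a rough sketch, a bounding box payload might look like the following. The schema ID and Data Row ID here are placeholders you would replace with values from your own ontology and dataset; refer to the annotation type documentation for the exact fields each type requires.

```python
import json
import uuid

# Hypothetical placeholder IDs -- replace with the schemaId of the
# bounding box tool from your project's ontology and a real Data Row ID.
BBOX_SCHEMA_ID = "<bounding-box-tool-schema-id>"
DATA_ROW_ID = "<data-row-id>"

annotations = [
    {
        "uuid": str(uuid.uuid4()),       # unique identifier for this annotation
        "schemaId": BBOX_SCHEMA_ID,      # ontology feature this annotation maps to
        "dataRow": {"id": DATA_ROW_ID},  # Data Row the annotation belongs to
        "bbox": {"top": 48, "left": 58, "height": 213, "width": 215},
    },
]

# NDJSON is simply one JSON object per line.
with open("annotations.ndjson", "w") as f:
    for annotation in annotations:
        f.write(json.dumps(annotation) + "\n")
```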

Step 2: Import your annotations

Use the Python SDK to create an import job for your annotations, as shown in the sketch below.
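
A minimal sketch, assuming the labelbox package (SDK 3.7.0 or later) and placeholder credentials; it uploads the NDJSON file written in Step 1 as a ground truth import job:

```python
from labelbox import Client, LabelImport

# Assumes a valid API key and a project to which the Data Rows
# referenced in the payload are attached.
client = Client(api_key="<your-api-key>")

# Create a ground truth import job from the NDJSON file written in Step 1.
upload_job = LabelImport.create_from_file(
    client=client,
    project_id="<project-id>",
    name="ground-truth-import-1",
    path="annotations.ndjson",
)
```

If you already have the payload in memory as a list of dicts, LabelImport.create_from_objects accepts it directly instead of a file path.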

πŸ“˜

Note

Before you start a new import job for a Data Row, make sure there are no existing MAL annotations on that Data Row. Duplicate import jobs may overwrite existing labels or result in unexpected behavior.

πŸ“˜

Note

When you run an import job, the Activity page in Labelbox will not reflect any changes until the entire job is complete.
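
Because progress only becomes visible once the job completes, it can be useful to block on completion in your script and then inspect the result. A small sketch, continuing from the upload_job created in Step 2:

```python
# Block until the import job finishes, then inspect the outcome.
upload_job.wait_until_done()

print("State:", upload_job.state)    # e.g. FINISHED
print("Errors:", upload_job.errors)  # per-annotation errors, if any
```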

Billing

You can view the number of annotations imported for billing purposes on the billing usage page. Note that for certain annotation types, the billing system may report counts that differ from the annotation count on the project overview page; this is expected behavior.

