Quickstart guide

To get started with Model Diagnostics, you will need to upload predictions and metrics to Labelbox.

Use the Google Colab below to work through this guide.

Step 1. Prepare predictions

To upload predictions to Diagnostics, you need to convert them to the Labelbox NDJSON format. The easiest way to do this is with Labelbox Annotation Types. The tutorials below will help you get started with common model types.
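The snippets in this guide assume imports along these lines from the labelbox Python SDK. Exact import paths can vary between SDK versions, and the API key and project id below are placeholders, so treat this as a sketch:

import uuid

from labelbox import Client, OntologyBuilder
from labelbox.data.annotation_types import (
    Label, LabelList, ObjectAnnotation,
    Point, Line, Rectangle, Polygon, Mask, MaskData,
)
from labelbox.data.serialization import NDJsonConverter
from tqdm import notebook

client = Client(api_key="YOUR_API_KEY")  # placeholder: your Labelbox API key
project_id = "YOUR_PROJECT_ID"           # placeholder: the project to export labels from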

project = client.get_project(project_id)
ontology = project.ontology()

MAX_LABELS = 2000
# Export labels
labels = [l for idx, l in enumerate(project.label_generator()) if idx < MAX_LABELS]

# Make a prediction for each Label
predictions = LabelList()
for label in notebook.tqdm(labels):
    # --- replace (start) ---
    # Replace this section with your own model's inference code.
    # Learn more about how to leverage annotation types for data management here:
    # https://docs.labelbox.com/docs/annotation-types-1
    image = label.data.value
    inferences = model.predict(image)

    # Convert inferences into prediction annotations.
    # Pick the annotation value type that matches your model output
    # (instance, seg, class_name, and the box coordinates come from your inference results):
    annotations = []
    for inf in inferences:
        # value = Polygon(points=[Point(x=x, y=y) for x, y in instance])
        # value = Point(x=x, y=y)
        # value = Line(points=[Point(x=x, y=y) for x, y in instance])
        # value = Mask(mask=MaskData.from_2D_arr(seg * 255), color=255)
        value = Rectangle(start=Point(x=xmin, y=ymin), end=Point(x=xmax, y=ymax))
        annotations.append(ObjectAnnotation(name=class_name, value=value))

    # Create a Label composed of the predictions for this data row
    predictions.append(Label(data=image, annotations=annotations))
    # --- replace (end) ---
    

# the signer is used to upload segmentation masks to Labelbox
signer = lambda c: client.upload_data(content=c, sign=True)

predictions = (
    predictions
    .add_url_to_masks(signer)  # upload mask data
    .assign_feature_schema_ids(OntologyBuilder.from_project(project))  # convert class names to feature_schema_ids
)

Step 2. Create a Model & Model Run

A Model defines an ontology for the predictions and annotations in its Model Runs. To compare performance across Model Runs, upload them to the same Model. Model and Model Run names must be unique. You can use the delete method to clean up unused Models and Model Runs (see the cleanup sketch at the end of this step).

# select an ontology
ontology = project.ontology()
model_name = "detection-model"
model_run_name = "0.0.0"

# create a model with the same ontology as the project
lb_model = client.create_model(name=model_name, ontology_id=ontology.uid)
lb_model_run = lb_model.create_model_run(model_run_name)

# attach the exported ground truth labels to the model run
lb_model_run.upsert_labels([label.uid for label in labels])
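If a name collision blocks you from re-running this step, here is a minimal cleanup sketch, assuming the SDK's delete methods and that you still hold the lb_model_run and lb_model references:

# remove an unused model run, then its parent model
lb_model_run.delete()
lb_model.delete()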

Step 3. Calculate Metrics

Now that you have predictions and ground truth, you can compute metrics to use in your analysis. Labelbox provides some common built-in metrics; you can also upload your own values as custom metrics (a custom-metric sketch follows the built-in example below).

# Built-in metrics
from labelbox.data.metrics.group import get_label_pairs
from labelbox.data.metrics import feature_miou_metric, feature_confusion_matrix_metric

pairs = get_label_pairs(labels, predictions, filter_mismatch=True)
for ground_truth, prediction in pairs.values():
    metrics = []
    metrics.extend(feature_miou_metric(ground_truth.annotations, prediction.annotations))
    metrics.extend(feature_confusion_matrix_metric(ground_truth.annotations, prediction.annotations))
    prediction.annotations.extend(metrics)  # metrics are uploaded alongside predictions
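To attach your own values, you can use a scalar custom metric. A minimal sketch using ScalarMetric from the annotation types; the metric name and value here are illustrative placeholders:

from labelbox.data.annotation_types import ScalarMetric

# attach an illustrative custom metric to every prediction Label
for prediction in predictions:
    prediction.annotations.append(
        ScalarMetric(metric_name="mean_confidence", value=0.87)
    )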

Step 4. Upload predictions & metrics

Now that the metrics have been computed, you must serialize both the predictions and the metrics to the Labelbox NDJSON format using the NDJsonConverter. Once the upload is complete, you can navigate to the Models tab to begin using Diagnostics.

upload_data = NDJsonConverter.serialize(predictions)  # serialize to ndjson
upload_task = lb_model_run.add_predictions(f'mea-import-{uuid.uuid4()}', upload_data)
upload_task.wait_until_done()
print(upload_task.state)
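Before moving on, it is worth checking the import for per-item failures. A short sketch, assuming the returned task exposes an errors list as annotation imports in the SDK generally do:

# list any predictions that failed to import
if upload_task.errors:
    for error in upload_task.errors:
        print(error)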


# Print links for the first 5 annotation groups so you can open them in the Labelbox UI
for idx, annotation_group in enumerate(lb_model_run.annotation_groups()):
    if idx == 5:
        break
    print(annotation_group.url)
