Foundry apps

A developer guide for creating and managing Foundry applications.

Foundry apps help automate data labeling and enrichment. Here, we show how to use the Labelbox SDK to manage Foundry apps and how to use a Foundry app to predict and import annotations into a dataset.

An interactive tutorial is also available as a Colab notebook; it shows how to run and import annotations from a Foundry app.

Create a Foundry app

A Foundry app (short for Foundry application) helps automate data labeling and enrichment. Once you’ve created a Foundry app, you can run it repeatedly against new data.

To create a Foundry app, use Labelbox Model. App creation is currently handled through the UI rather than the SDK.

Once the app is created, you can use the Labelbox SDK to run your app and manage its results.

Run a Foundry app

You can use Labelbox Model to run Foundry apps. Results are displayed like any other model run.

When running a Foundry app, you must provide an app ID. You can find the app ID by going to the Model tab and selecting Apps. When you select a Foundry app, the app ID will appear in the top right corner of the screen.

In addition to the app ID, run_foundry_app also requires a DataRowIdentifiers object (such as lb.GlobalKeys) and a unique model run name.

task = client.run_foundry_app(model_run_name=f"Amazon-{str(uuid.uuid4())}",
                              data_rows=lb.GlobalKeys(
                                  [global_key] # Provide a list of global keys 
                                  ), 
                              app_id=AMAZON_REKOGNITION_APP_ID)

task.wait_till_done()

print(f"Errors: {task.errors}") 

#Obtain model run ID from task
MODEL_RUN_ID = task.metadata["modelRunId"]
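
The wait-check-extract pattern above can be wrapped in a small helper. A minimal sketch (the function name is our own, not part of the SDK; it assumes any task object exposing errors and a metadata dict, as in the snippet above):

```python
def model_run_id_from_task(task):
    """Return the model run ID from a completed Foundry task.

    Raises RuntimeError if the task reported any errors.
    """
    if task.errors:
        raise RuntimeError(f"Foundry run failed: {task.errors}")
    return task.metadata["modelRunId"]
```

For example, MODEL_RUN_ID = model_run_id_from_task(task) replaces the manual error check after task.wait_till_done().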

Send Foundry annotations to Annotate

When you send predictions to Annotate from Catalog, you may choose to include or exclude certain parameters.

Parameters

predictions_ontology_mapping (required)
  A dictionary mapping the model's ontology feature schema IDs to the project's ontology feature schema IDs. See Send predictions to Annotate for more information on where to access your predictions map.

exclude_data_rows_in_project (optional)
  Excludes data rows that are already in the project.

override_existing_annotations_rule (optional)
  Defines how to handle classification conflicts between existing data rows in the project and incoming predictions from the source model run (or annotations from the source project). Options include:
  - ConflictResolutionStrategy.KeepExisting (default)
  - ConflictResolutionStrategy.OverrideWithPredictions
  - ConflictResolutionStrategy.OverrideWithAnnotations

batch_priority (optional)
  The priority assigned to the batch of data rows (1-5).

Sample script

model_run = client.get_model_run("<model_run_id>")

send_to_annotations_params = {
    "predictions_ontology_mapping": PREDICTIONS_ONTOLOGY_MAPPING,
    "exclude_data_rows_in_project": False,
    "override_existing_annotations_rule": ConflictResolutionStrategy.OverrideWithPredictions,
    "batch_priority": 5,
}


task = model_run.send_to_annotate_from_model(
    destination_project_id=project.uid,
    task_queue_id=None, # ID of a workflow task queue. Set to None to convert pre-labels to ground truth; look up queue IDs with project.task_queues().
    batch_name="Foundry Demo Batch",
    data_rows=lb.GlobalKeys(
        [global_key] # Provide a list of global keys from foundry app task
    ),
    params=send_to_annotations_params
    )

task.wait_till_done()

print(f"Errors: {task.errors}")

End-to-end example: Run and import annotations from a Foundry app

Regardless of which model you use from Foundry, the workflow through the SDK is similar. Step 6 (Map ontology through the UI) is where the process can differ slightly.

Before you start

You will need to import these libraries to use the code examples in this section.

import labelbox as lb
from labelbox.schema.conflict_resolution_strategy import ConflictResolutionStrategy
import uuid

Replace with your API key

API_KEY = ""
client = lb.Client(API_KEY)

Step 1: Import data rows into Catalog

You must have data rows in Catalog before you can run them through Foundry. In this example, we are using an image data row.

# send a sample image as data row for a dataset
global_key = str(uuid.uuid4())

test_img_url = {
    "row_data":
        "https://storage.googleapis.com/labelbox-datasets/image_sample_data/2560px-Kitano_Street_Kobe01s5s4110.jpeg",
    "global_key":
        global_key
}

dataset = client.create_dataset(name="foundry-demo-dataset")
task = dataset.create_data_rows([test_img_url])
task.wait_till_done()

print(f"Errors: {task.errors}")
print(f"Failed data rows: {task.failed_data_rows}")
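
To import more than one image at a time, you can build the payload list programmatically before calling create_data_rows. A minimal sketch (build_image_payloads is a hypothetical helper of our own, not an SDK function):

```python
import uuid

def build_image_payloads(urls):
    """Build a create_data_rows payload for each image URL, assigning each a unique global key."""
    return [{"row_data": url, "global_key": str(uuid.uuid4())} for url in urls]
```

You would then pass the result to dataset.create_data_rows(...) exactly as in the example above.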

Step 2: Create/select an ontology that matches the model ontology

Your project should have the correct ontology set up, with all the tools and classifications supported by your model and data type.

For example, when using Amazon Rekognition, you must create a bounding box annotation for your ontology since it only supports object detection. Likewise, when using YOLOv8 cls, you must create a classification annotation for your ontology since it only supports image classification.

# Create an ontology with the two bounding box tools included with Amazon Rekognition: Car and Person
ontology_builder = lb.OntologyBuilder(
    classifications=[],
    tools=[
        lb.Tool(tool=lb.Tool.Type.BBOX, name="Car"),
        lb.Tool(tool=lb.Tool.Type.BBOX, name="Person")
    ]
)

ontology = client.create_ontology("Image Bounding Box Annotation Demo Foundry",
                                  ontology_builder.asdict(),
                                  media_type=lb.MediaType.Image)

Step 3: Create a labeling project

Connect the ontology to the labeling project.

project = client.create_project(name="Foundry Image Demo",
                                media_type=lb.MediaType.Image)

project.setup_editor(ontology)

Step 4: Create Foundry application in UI

Currently, we do not support this workflow through the SDK.

Workflow:

  1. Navigate to Model and select Create > App
  2. Select Amazon Rekognition and name your Foundry application
  3. Customize your parameters and then select Save & Create

# Select your Foundry application in the UI and copy the app ID from the top right corner
AMAZON_REKOGNITION_APP_ID = ""

Step 5: Run Foundry app on data rows

This step generates annotations that can later be reused as pre-labels or ground truth in a project.

task = client.run_foundry_app(model_run_name=f"Amazon-{str(uuid.uuid4())}",
                              data_rows=lb.GlobalKeys(
                                  [global_key] # Provide a list of global keys 
                                  ), 
                              app_id=AMAZON_REKOGNITION_APP_ID)

task.wait_till_done()

print(f"Errors: {task.errors}")

#Obtain model run ID from task
MODEL_RUN_ID = task.metadata["modelRunId"]

Step 6: Map ontology through the UI

Mapping a model's ontology to a project's ontology is currently not supported through the SDK. However, to show how to send Foundry predictions to a project, we will generate the mapping of the Foundry app ontology to the project ontology through the UI.

Workflow:

  1. Use Catalog to select the dataset for your model run. Select Select all in the top right corner.

  2. Select Manage selection > Send to Annotate.

  3. Select your project from the Project menu.

  4. When sending annotations to Annotate for review, you typically select a workflow step. This isn't necessary for this example.

  5. Place a checkmark next to Include model predictions and then select Map.

  6. Select the incoming ontology and matching ontology features for Car and Person.

  7. When the features are mapped, select Copy ontology mapping as JSON.

  8. Paste your copied JSON into the definition of PREDICTIONS_ONTOLOGY_MAPPING

    # Copy map ontology through the UI then paste JSON here
    PREDICTIONS_ONTOLOGY_MAPPING = {}
    
  9. In a production workflow, you would typically save your configuration. You can skip this step for the sake of this example.
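
For reference, the copied JSON is a flat dictionary mapping each feature schema ID in the Foundry app's ontology to the matching feature schema ID in the project's ontology. A sketch of its shape (the IDs below are made-up placeholders; use the values copied from the UI):

```python
# Placeholder IDs for illustration only -- replace with the JSON copied from the UI.
PREDICTIONS_ONTOLOGY_MAPPING = {
    "<app_car_feature_schema_id>": "<project_car_feature_schema_id>",
    "<app_person_feature_schema_id>": "<project_person_feature_schema_id>",
}
```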

We will be sending predictions using the SDK in the following steps.

Step 7: Send model-generated annotations from Catalog to Annotate

model_run = client.get_model_run(MODEL_RUN_ID)

send_to_annotations_params = {
    "predictions_ontology_mapping": PREDICTIONS_ONTOLOGY_MAPPING,
    "exclude_data_rows_in_project": False,
    "override_existing_annotations_rule": ConflictResolutionStrategy.OverrideWithPredictions,
    "batch_priority": 5,
}

task = model_run.send_to_annotate_from_model(
    destination_project_id=project.uid,
    task_queue_id=None, # ID of a workflow task queue. Set to None to convert pre-labels to ground truth; look up queue IDs with project.task_queues().
    batch_name="Foundry Demo Batch",
    data_rows=lb.GlobalKeys(
        [global_key] # Provide a list of global keys from foundry app task
    ),
    params=send_to_annotations_params
    )

task.wait_till_done()

print(f"Errors: {task.errors}")