Your instructions are the single most important resource for your labeling team. This is where you move beyond simple definitions and provide the context, examples, and rules needed to handle real-world complexity. Well-crafted instructions are the first line of defense against inconsistent or inaccurate annotations.

Create labeling instructions

To add labeling instructions:
  1. Go to your project overview.
  2. Click on Labeling instructions.
  3. In the Instructions section, you may write the instructions directly in the text box, upload a PDF or HTML file, or add a video link.
What to include in your instructions:
  • Detailed Definitions: Go beyond the feature name. What exactly constitutes a “partially occluded vehicle” versus a “fully visible vehicle”?
  • Visual Examples (The Good and The Bad): Show clear examples of correctly labeled data. Just as importantly, show examples of common mistakes or edge cases that should be labeled differently or ignored.
  • Edge Case Guidance: Your data will never be perfect. Provide rules for how to handle blurry images, rare objects, or situations where multiple interpretations are possible.
  • “When in Doubt” Rules: Give your team a clear default action to take when they are unsure, such as flagging the asset for review.
Use the following template as a reference (or download this PDF file).
Changes to instructions are global
Remember that instructions live with the ontology, not the project. A single ontology can be shared across many projects. When you update the instructions for an ontology, those changes will immediately be reflected in every project that uses it.
If you need different instructions for different projects, the correct approach is to create a separate copy of your ontology. Each copy can then have its own unique instructions tailored to that specific project’s needs.

Validate understanding with quizzes

Quizzes serve as an automated check to confirm that every labeler has read and, more importantly, understood your instructions. By requiring a passing score before work can begin, you ensure a baseline level of competency across your entire team and catch misunderstandings early. The quiz is seamlessly integrated into the labeler’s workflow to ensure a smooth onboarding experience for every project.
  1. Review Instructions: The labeler is first prompted to carefully read the instructions you’ve provided.
  2. Take the Quiz: They then proceed to the quiz, where they must answer the questions based on their understanding.
  3. Receive AI-Powered Feedback: Each answer is scored by an AI on a 1-5 scale based on its semantic similarity to the correct answer. A score of 3 or higher is a pass.
  4. Iterate or Proceed: If a labeler fails, they receive immediate, targeted feedback on their incorrect answers and can retake the quiz. They cannot access the labeling editor until they achieve a passing score.
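The scoring and gating behavior in steps 3-4 can be sketched roughly as follows. This is purely illustrative: `semantic_score` stands in for the AI grader (a real implementation would call an embedding or LLM service), and the assumption that a labeler passes only when every answer scores at least 3 is ours, not a documented rule.

```python
PASS_THRESHOLD = 3  # answers are scored 1-5; 3 or higher is a pass


def semantic_score(answer: str, reference: str) -> int:
    """Stand-in for the AI grader: returns a 1-5 similarity score.

    Illustrative only -- a crude word-overlap heuristic replaces the
    real semantic-similarity model.
    """
    a, r = set(answer.lower().split()), set(reference.lower().split())
    overlap = len(a & r) / max(len(r), 1)
    return max(1, min(5, round(1 + 4 * overlap)))


def grade_quiz(answers: list[str], references: list[str]) -> tuple[bool, list[int]]:
    """Score each answer; assume the labeler passes only if every answer passes."""
    scores = [semantic_score(a, r) for a, r in zip(answers, references)]
    return all(s >= PASS_THRESHOLD for s in scores), scores
```

A failed attempt would then surface the per-question scores as the "targeted feedback" described above, and the labeler retakes the quiz until `grade_quiz` returns a pass.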

Automatically generate quiz

To auto-generate a quiz for your ontology:
  1. Navigate to Schema > Ontology and select your ontology.
  2. Click the Quiz tab in the ontology editor.
  3. Click Create quiz to automatically generate quiz questions based on your instructions.

Create quiz manually

If you prefer to create your quiz manually:
  1. Navigate to Schema > Ontology and select your ontology.
  2. Click the Quiz tab in the ontology editor.
  3. Click Add Questions Manually or Add Question to create custom quiz questions.
Best practices
  • Keep questions focused on specific concepts from your instructions
  • Test understanding of edge cases and quality standards
  • Ensure questions can be answered in 20-250 characters
  • Cover different aspects of your labeling guidelines
  • Note that videos and external links in instructions are not included in quiz generation
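The 20-250 character constraint above is easy to check when drafting reference answers. A minimal helper (the function name is ours, not part of any product API):

```python
MIN_LEN, MAX_LEN = 20, 250  # expected answer length in characters


def answer_length_ok(reference_answer: str) -> bool:
    """Check that a reference answer falls in the 20-250 character range."""
    return MIN_LEN <= len(reference_answer.strip()) <= MAX_LEN
```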

Quiz analytics

Quiz analytics are available in each project’s Performance dashboard. When labelers enter a project whose ontology has quiz-enabled instructions, they must complete and pass the quiz before they can begin labeling. Because quiz attempts are tracked at the project level, you can view comprehensive analytics for each project to monitor performance and identify areas for improvement. For complete documentation of all available metrics and charts, and guidance on interpreting the data, see Instruction Quiz analytics in the Performance Dashboard documentation. Analytics include:
  • Overall pass rates and average scores
  • Individual user performance and improvement trends
  • Question-level analytics showing which questions are most challenging
  • Score and attempt distribution charts
  • Time-based statistics to identify trends
These insights help you refine your instructions and quiz questions based on actual labeler performance data.
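As a rough illustration of the question-level analytics listed above, per-question average scores and pass rates could be derived from raw attempt records like this. The record format and names here are invented for the sketch, not a documented export format:

```python
from collections import defaultdict
from statistics import mean

PASS_THRESHOLD = 3  # scores run 1-5; 3 or higher passes

# Hypothetical attempt records: (user, question_id, score on the 1-5 scale)
attempts = [
    ("ana", "q1", 5), ("ana", "q2", 2),
    ("ben", "q1", 4), ("ben", "q2", 3),
]


def question_stats(records):
    """Per-question average score and pass rate, to spot the hardest questions."""
    by_question = defaultdict(list)
    for _user, qid, score in records:
        by_question[qid].append(score)
    return {
        qid: {
            "avg": mean(scores),
            "pass_rate": sum(s >= PASS_THRESHOLD for s in scores) / len(scores),
        }
        for qid, scores in by_question.items()
    }
```

Questions with a low `pass_rate` are the ones most likely to need clearer wording, or a clearer corresponding passage in the instructions themselves.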