Your instructions are the single most important resource for your labeling team. This is where you move beyond simple definitions and provide the context, examples, and rules needed to handle real-world complexity. Well-crafted instructions are the first line of defense against inconsistent or inaccurate annotations.

Create labeling instructions

To add labeling instructions:
  1. Go to your project overview.
  2. Click on Labeling instructions.
  3. In the Instructions section, you may write the instructions directly in the text box, upload a PDF or HTML file, or add a video link.
What to include in your instructions:
  • Detailed Definitions: Go beyond the feature name. What exactly constitutes a “partially occluded vehicle” versus a “fully visible vehicle”?
  • Visual Examples (The Good and The Bad): Show clear examples of correctly labeled data. Just as importantly, show examples of common mistakes or edge cases that should be labeled differently or ignored.
  • Edge Case Guidance: Your data will never be perfect. Provide rules for how to handle blurry images, rare objects, or situations where multiple interpretations are possible.
  • “When in Doubt” Rules: Give your team a clear default action to take when they are unsure, such as flagging the asset for review.
Use the following template as a reference (or download this PDF file).
Changes to instructions are global. Remember that instructions live with the ontology, not the project. A single ontology can be shared across many projects. When you update the instructions for an ontology, those changes will immediately be reflected in every project that uses it.
If you need different instructions for different projects, the correct approach is to create a separate copy of your ontology. Each copy can then have its own unique instructions tailored to that specific project’s needs.

Validate understanding with quizzes

Quizzes serve as an automated check to confirm that every labeler has read and, more importantly, understood your instructions. By requiring a passing score before work can begin, you ensure a baseline level of competency across your entire team and catch misunderstandings early. The quiz is seamlessly integrated into the labeler’s workflow to ensure a smooth onboarding experience for every project.
  1. Review Instructions: The labeler is first prompted to carefully read the instructions you’ve provided.
  2. Take the Quiz: They then proceed to the quiz, where they must answer the questions based on their understanding.
  3. Receive AI-Powered Feedback: Each answer is scored by an AI on a 1-5 scale based on its semantic similarity to the correct answer. A score of 3 or higher is a pass.
  4. Iterate or Proceed: If a labeler fails, they receive immediate, targeted feedback on their incorrect answers and can retake the quiz. They cannot access the labeling editor until they achieve a passing score.
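The pass rule in steps 3 and 4 can be sketched as a simple check. This is a hypothetical illustration of the logic described above, not Labelbox's actual scoring code; the function name and data shape are assumptions:

```python
PASS_THRESHOLD = 3  # the AI scores each answer 1-5; 3 or higher passes

def attempt_passed(answer_scores):
    """Return True if an attempt's average answer score meets the threshold.

    `answer_scores` is a hypothetical list of per-question AI scores (1-5),
    each reflecting semantic similarity to the expected answer.
    """
    average = sum(answer_scores) / len(answer_scores)
    return average >= PASS_THRESHOLD
```

A labeler whose answers score [3, 4, 3] would pass (average 3.33), while [1, 2, 2] would trigger feedback and a retake.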

Automatically generate quiz

To auto-generate a quiz for your ontology:
  1. Navigate to Schema > Ontology and select your ontology.
  2. Click the Quiz tab in the ontology editor.
  3. Click Create quiz to automatically generate quiz questions based on your instructions.

Create quiz manually

If you prefer to create your quiz manually:
  1. Navigate to Schema > Ontology and select your ontology.
  2. Click the Quiz tab in the ontology editor.
  3. Click Add Questions Manually or Add Question to create custom quiz questions.
Best practices
  • Keep questions focused on specific concepts from your instructions.
  • Test understanding of edge cases and quality standards.
  • Ensure questions can be answered in 20-250 characters.
  • Cover different aspects of your labeling guidelines.
  • Note that videos and external links in instructions are not included in quiz generation.

Instructions quiz metrics

The Instruction Quiz tab provides comprehensive analytics for quiz performance when quizzes are enabled in your ontology instructions. These analytics help you understand how well labelers comprehend your instructions and identify areas where additional clarification may be needed.

How to access your quiz analytics

To view quiz analytics for your project:
  1. Navigate to your project.
  2. Go to the Performance tab.
  3. Select the Instruction Quiz tab at the top of the page.
  4. Use the date range picker to filter analytics by your desired time period.

An overview of the metrics

The dashboard displays the following key performance indicators at the top:
  • Total quiz attempts: The total number of times labelers have taken the quiz during the selected period.
  • Overall pass rate: The percentage of quiz attempts that achieved a passing score (3 out of 5 or higher).
  • Average score: The mean score across all quiz attempts on a 1-5 scale.
  • Unique users: The number of distinct labelers who have attempted the quiz.
  • Total questions: The number of questions in the current quiz.
  • Average time to pass: The average time from the first attempt to the first passing attempt. This excludes the time spent on the first attempt itself.
  • Average attempts to pass: The average number of quiz attempts needed for labelers to pass the quiz for the first time.
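As a rough sketch, the headline metrics above can be derived from raw attempt records. The record shape and function below are assumptions for illustration, not an actual Labelbox API:

```python
PASS_THRESHOLD = 3  # attempts scoring 3 or higher on the 1-5 scale pass

def dashboard_metrics(attempts):
    """Summarize attempt records, where each record is a (user_email, score) pair."""
    total = len(attempts)
    scores = [score for _, score in attempts]
    passed = sum(1 for s in scores if s >= PASS_THRESHOLD)
    return {
        "total_quiz_attempts": total,
        "overall_pass_rate": passed / total,
        "average_score": sum(scores) / total,
        "unique_users": len({user for user, _ in attempts}),
    }
```

For example, three attempts by two labelers, two of them passing, yield a pass rate of about 67% and a unique-user count of 2.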

Understanding the visual analytics

The dashboard includes two key distribution charts to help you visualize the quiz data.
  • Pass attempt distribution: This chart shows how many attempts labelers need to pass the quiz and helps you identify if the quiz difficulty is appropriate. For example, if most labelers pass on the first attempt, the quiz may be too easy. If they need many attempts, it may be too difficult.
  • Score distribution: This chart displays the range of scores across all attempts and shows how scores are distributed on the 1-5 scale. This helps you identify if most labelers are performing well or struggling.
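For intuition, both charts are histograms over the raw attempt data. A minimal sketch, with assumed data shapes (scores per labeler in attempt order):

```python
from collections import Counter

def pass_attempt_distribution(attempts_by_user, threshold=3):
    """Count, per labeler, which attempt number produced the first pass."""
    dist = Counter()
    for scores in attempts_by_user.values():  # scores in chronological order
        for attempt_number, score in enumerate(scores, start=1):
            if score >= threshold:
                dist[attempt_number] += 1
                break  # only the first pass counts
    return dist

def score_distribution(all_scores):
    """Bucket every attempt's score into the 1-5 bins."""
    return Counter(int(round(s)) for s in all_scores)
```

If one labeler passes on the second attempt and another on the first, the pass-attempt chart shows one labeler in each of the "1" and "2" buckets.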

Analyzing the question performance table

This table shows detailed metrics for each quiz question so you can identify any problematic questions.
  • Question: The full text of the quiz question. Questions marked “Past question” are from previous quiz versions.
  • Average score: The average score (1-5 scale) for this question across all attempts.
  • Average attempts to pass: The average number of attempts needed for labelers to answer this question correctly.
  • Improvement trend: The score difference between the first and last attempts, showing whether labelers improve over time.
  • Average time to pass: The average time spent to successfully pass this question (displayed in MM:SS format).
Use this table to identify problem areas:
  • Questions with low average scores may need clearer instruction content.
  • Questions that require many attempts suggest the topic needs a better explanation.
  • Negative improvement trends indicate labelers aren’t learning from retakes.

Analyzing the user performance table

This table shows individual labeler performance, so you can identify labelers who may need additional training or support.
  • Email: The labeler’s email address.
  • Attempts: The total number of quiz attempts, with the count of passed attempts in parentheses.
  • Average score: The average score across all of this user’s attempts (1-5 scale).
  • Improvement trend: The score difference between the user’s first and last attempts.
  • Pass rate: The percentage of this user’s attempts that achieved a passing score.
  • Time to pass: The time from the first attempt to the first successful pass (displayed in MM:SS format).
  • Quiz status: Shows “Passed” if the user has successfully passed at least once, and “Not Passed” otherwise.
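The per-user columns reduce to a few arithmetic roll-ups over that labeler's attempts, sketched below with assumed field names (not an actual API):

```python
def user_row(scores, threshold=3):
    """Roll up one labeler's attempt scores (1-5 scale, in chronological order)."""
    passed = [s for s in scores if s >= threshold]
    return {
        "attempts": len(scores),
        "passed_attempts": len(passed),
        "average_score": sum(scores) / len(scores),
        "improvement_trend": scores[-1] - scores[0],  # last minus first attempt
        "pass_rate": len(passed) / len(scores),
        "quiz_status": "Passed" if passed else "Not Passed",
    }
```

A labeler whose scores go [2, 3, 4] shows a positive improvement trend of +2 and a "Passed" status.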

Using analytics to improve your quiz

Based on the analytics data, you can take several actions to improve your quiz.
  • If overall pass rates are low:
    • Review your instructions for clarity and completeness.
    • Consider breaking down complex concepts into simpler explanations.
    • Add more examples to illustrate key points.
  • If specific questions have low scores:
    • Revise the related section in your instructions.
    • Ensure the question accurately tests the intended knowledge.
    • Adjust the expected answer to be more flexible.
  • If labelers need many attempts to pass:
    • Simplify your quiz questions or make instructions more explicit.
    • Add practice examples to your instructions.
    • Consider reducing the passing threshold if appropriate.
  • If improvement trends are negative:
    • Review the feedback provided by the AI scoring.
    • Ensure questions test understanding, not memorization.
    • Consider whether the quiz is testing the right concepts.