Create labeling instructions
To add labeling instructions:
- Go to your project overview.
- Click on Labeling instructions.
- In the Instructions section, you may write the instructions directly in the text box, upload a PDF or HTML file, or add a video link.

Effective instructions cover more than the label names themselves. Aim to include:
- Detailed Definitions: Go beyond the feature name. What exactly constitutes a “partially occluded vehicle” versus a “fully visible vehicle”?
- Visual Examples (The Good and The Bad): Show clear examples of correctly labeled data. Just as importantly, show examples of common mistakes or edge cases that should be labeled differently or ignored.
- Edge Case Guidance: Your data will never be perfect. Provide rules for how to handle blurry images, rare objects, or situations where multiple interpretations are possible.
- “When in Doubt” Rules: Give your team a clear default action to take when they are unsure, such as flagging the asset for review.
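The guidance above amounts to replacing judgment calls with explicit rules. As a purely illustrative sketch (the function, the 0.9 threshold, and the label names are all invented for this example, not part of the product), here is what an unambiguous occlusion rule plus a "when in doubt" default might look like when written down:

```python
from typing import Optional

def classify_vehicle(visible_fraction: Optional[float]) -> str:
    """Apply an explicit occlusion rule instead of leaving it to judgment.

    visible_fraction is the estimated fraction of the vehicle that is
    visible, or None when the labeler cannot tell (e.g. a blurry image).
    The 0.9 threshold is a made-up example value, not a product default.
    """
    if visible_fraction is None:
        return "flag_for_review"  # the "when in doubt" default action
    if visible_fraction >= 0.9:
        return "fully_visible_vehicle"
    return "partially_occluded_vehicle"
```

Writing the rule this precisely in your instructions means two labelers looking at the same image should reach the same label.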
Changes to instructions are global

Remember that instructions live with the ontology, not the project. A single ontology can be shared across many projects. When you update the instructions for an ontology, those changes will immediately be reflected in every project that uses it.
Validate understanding with quizzes
Quizzes serve as an automated check to confirm that every labeler has read and, more importantly, understood your instructions. By requiring a passing score before work can begin, you ensure a baseline level of competency across your entire team and catch misunderstandings early. The quiz is integrated into the labeler’s workflow to ensure a smooth onboarding experience for every project.

- Review Instructions: The labeler is first prompted to carefully read the instructions you’ve provided.
- Take the Quiz: They then proceed to the quiz, where they must answer the questions based on their understanding.
- Receive AI-Powered Feedback: Each answer is scored by an AI on a 1-5 scale based on its semantic similarity to the correct answer. A score of 3 or higher is a pass.
- Iterate or Proceed: If a labeler fails, they receive immediate, targeted feedback on their incorrect answers and can retake the quiz. They cannot access the labeling editor until they achieve a passing score.
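The gating logic in the steps above can be sketched in a few lines. Note this is a hedged illustration of the described behavior, not the product's implementation: the AI similarity scorer is not exposed as code, and the exact rule for aggregating per-answer scores into a pass is an assumption here.

```python
PASS_THRESHOLD = 3  # answers are scored 1-5; 3 or higher passes

def grade_attempt(answer_scores: list) -> bool:
    """One attempt passes when every answer meets the threshold.

    Assumption: the product may aggregate per-answer scores differently;
    this mirrors the per-answer 1-5 scoring described above.
    """
    return all(score >= PASS_THRESHOLD for score in answer_scores)

def can_open_editor(attempt_history: list) -> bool:
    """The labeling editor unlocks once at least one attempt has passed."""
    return any(grade_attempt(scores) for scores in attempt_history)
```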
Automatically generate quiz
To auto-generate a quiz for your ontology:
- Navigate to Schema > Ontology and select your ontology.
- Click the Quiz tab in the ontology editor.
- Click Create quiz to automatically generate quiz questions based on your instructions.
Create quiz manually
If you prefer to create your quiz manually:
- Navigate to Schema > Ontology and select your ontology.
- Click the Quiz tab in the ontology editor.
- Click Add Questions Manually or Add Question to create custom quiz questions.
Instructions quiz metrics
The instruction quiz tab provides comprehensive analytics for quiz performance when you have enabled quizzes in your ontology instructions. These analytics help you understand how well labelers comprehend your instructions and identify areas where additional clarification may be needed.

How to access your quiz analytics
To view quiz analytics for your project, follow these steps:
- Navigate to your project.
- Go to the Performance tab.
- Select the Instruction Quiz tab at the top of the page.
- Use the date range picker to filter analytics by your desired time period.
An overview of the metrics
The dashboard displays the following key performance indicators at the top:

| Metric | Description |
|---|---|
| Total quiz attempts | The total number of times labelers have taken the quiz during the selected period. |
| Overall pass rate | The percentage of quiz attempts that achieved a passing score (3 out of 5 or higher). |
| Average score | The mean score across all quiz attempts on a 1-5 scale. |
| Unique users | The number of distinct labelers who have attempted the quiz. |
| Total questions | The number of questions in the current quiz. |
| Average time to pass | The average time from the first attempt to the first passing attempt. This excludes the time spent on the first attempt itself. |
| Average attempts to pass | The average number of quiz attempts needed for labelers to pass the quiz for the first time. |
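The headline numbers above are straightforward aggregations over raw attempt records. The following sketch recomputes a few of them; the `Attempt` record and its field names are invented for illustration and do not reflect any product data model:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

PASSING_SCORE = 3  # 3 out of 5 or higher is a passing score

@dataclass
class Attempt:
    user: str        # labeler identifier (hypothetical field)
    score: float     # score on the 1-5 scale
    at: datetime     # when the attempt was submitted

def dashboard_metrics(attempts):
    """Recompute the dashboard's headline numbers from raw attempts."""
    total = len(attempts)
    passed = [a for a in attempts if a.score >= PASSING_SCORE]
    return {
        "total_quiz_attempts": total,
        "overall_pass_rate": len(passed) / total if total else 0.0,
        "average_score": sum(a.score for a in attempts) / total if total else 0.0,
        "unique_users": len({a.user for a in attempts}),
    }

def attempts_to_pass(attempts, user) -> Optional[int]:
    """How many attempts a user needed before first passing (None if never)."""
    ordered = sorted((a for a in attempts if a.user == user), key=lambda a: a.at)
    for n, attempt in enumerate(ordered, start=1):
        if attempt.score >= PASSING_SCORE:
            return n
    return None
```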
Understanding the visual analytics
The dashboard includes two key distribution charts to help you visualize the quiz data.

- Pass attempt distribution: This chart shows how many attempts labelers need to pass the quiz and helps you identify if the quiz difficulty is appropriate. For example, if most labelers pass on the first attempt, the quiz may be too easy. If they need many attempts, it may be too difficult.
- Score distribution: This chart displays the range of scores across all attempts and shows how scores are distributed on the 1-5 scale. This helps you identify if most labelers are performing well or struggling.
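Both charts are histograms over the same underlying attempt data. This sketch mirrors only what the descriptions above say (the charts' actual internals are not documented, and the input shapes here are assumptions):

```python
from collections import Counter

def score_distribution(all_scores):
    """Count how many attempts landed on each score of the 1-5 scale."""
    return Counter(all_scores)

def pass_attempt_distribution(scores_per_user, passing=3):
    """For each labeler, count the attempts until their first passing score.

    scores_per_user maps a labeler to their chronological attempt scores;
    labelers who never pass are left out of the histogram.
    """
    histogram = Counter()
    for scores in scores_per_user.values():
        for attempt_number, score in enumerate(scores, start=1):
            if score >= passing:
                histogram[attempt_number] += 1
                break
    return histogram
```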
Analyzing the question performance table
This table shows detailed metrics for each quiz question so you can identify any problematic questions.

| Metric | Description |
|---|---|
| Question | The full text of the quiz question. Questions marked “Past question” are from previous quiz versions. |
| Average score | The average score (1-5 scale) for this question across all attempts. |
| Average attempts to pass | The average number of attempts needed for labelers to answer this question correctly. |
| Improvement trend | The score difference between the first and last attempts, showing if labelers improve over time. |
| Average time to pass | The average time spent to successfully pass this question (displayed in MM:SS format). |
- Questions with low average scores may need clearer instruction content.
- Questions requiring many attempts suggest the topic needs a better explanation.
- Negative improvement trends indicate labelers aren’t learning from retakes.
Analyzing the user performance table
This table shows individual labeler performance, so you can identify labelers who may need additional training or support.

| Metric | Description |
|---|---|
| User | The labeler’s email address. |
| Attempts | The total number of quiz attempts, with the count of passed attempts in parentheses. |
| Average score | The average score across all of this user’s attempts (1-5 scale). |
| Improvement trend | The score difference between the user’s first and last attempts. |
| Pass rate | The percentage of this user’s attempts that achieved a passing score. |
| Time to pass | The time from the first attempt to the first successful pass (displayed in MM:SS format). |
| Quiz status | This shows “Passed” if the user has successfully passed at least once, and “Not Passed” otherwise. |
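Each row of the user table can be derived from that labeler's chronological scores. As a hedged sketch (function name, input shape, and the passing threshold default are illustrative, mirroring the column descriptions above rather than the product's code):

```python
def user_row(scores, passing=3):
    """Summarize one labeler's chronological quiz scores into the table's columns.

    scores must be in chronological order and non-empty.
    """
    passed = [s for s in scores if s >= passing]
    return {
        "attempts": f"{len(scores)} ({len(passed)} passed)",
        "average_score": sum(scores) / len(scores),
        "improvement_trend": scores[-1] - scores[0],  # last minus first attempt
        "pass_rate": len(passed) / len(scores),
        "quiz_status": "Passed" if passed else "Not Passed",
    }
```

The improvement trend here is simply the last score minus the first, so a negative value means the labeler's most recent attempt scored worse than their first.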
Using analytics to improve your quiz
Based on the analytics data, you can take several actions to improve your quiz.

- If overall pass rates are low:
  - Review your instructions for clarity and completeness.
  - Consider breaking down complex concepts into simpler explanations.
  - Add more examples to illustrate key points.
- If specific questions have low scores:
  - Revise the related section in your instructions.
  - Ensure the question accurately tests the intended knowledge.
  - Adjust the expected answer to be more flexible.
- If labelers need many attempts to pass:
  - Simplify your quiz questions or make instructions more explicit.
  - Add practice examples to your instructions.
  - Consider reducing the passing threshold if appropriate.
- If improvement trends are negative:
  - Review the feedback provided by the AI scoring.
  - Ensure questions test understanding, not memorization.
  - Consider whether the quiz is testing the right concepts.