Reviewing

Overview of reviewing annotations on Ango Hub.

Reviewing is a key part of keeping your labels at the highest possible level of quality.

Reviewing is the act of looking at a labeling task again and confirming that it was done correctly or, optionally, fixing it if it wasn't. Only users with the Reviewer, Lead, or Manager roles can perform reviews.

How to review tasks (Reviewer, Lead, or Manager roles only)

From your project's dashboard, click on Start Reviewing on the top-right corner of the screen.

You will be shown the first unreviewed labeling task in the review queue.

If your project's workflow has more than one Review stage, the project has more than one review queue. In that case, clicking on Start Reviewing picks one of the queues at random.

If you wish to pick a specific review queue to work from, click on the downward-pointing chevron next to the Start Reviewing button and select the stage whose tasks you'd like to review.

From here, you can accept the labels as they are by clicking on Accept, or you can reject the labeling task by clicking on Reject in the top-right corner of the screen.

Accept and Reject are flags attached to the particular task. What they mean for your project depends on how the project's workflow was set up.

For example, you may set a workflow up in such a way that rejected tasks are sent to labeling again, and accepted ones are marked as done. See more on the documentation page for workflows.
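As a rough illustration, here is a minimal Python sketch of such a configuration. The stage names and the `route_after_review` function are hypothetical; in Ango Hub, routing is configured visually in the workflow editor, not in code.

```python
from dataclasses import dataclass

@dataclass
class Task:
    id: str
    review_flag: str  # "accept" or "reject", set by the reviewer

def route_after_review(task: Task) -> str:
    """One possible workflow configuration: rejected tasks return to
    the Label stage, accepted tasks are marked Complete."""
    if task.review_flag == "accept":
        return "Complete"
    if task.review_flag == "reject":
        return "Label"
    raise ValueError(f"unknown review flag: {task.review_flag}")

print(route_after_review(Task(id="t-1", review_flag="reject")))  # Label
```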

If the current review stage has the Read-Only toggle turned on, you will not be able to make modifications to the annotations. (More on setting up read-only reviews on the Workflow documentation page.)

If, however, the current stage is not read-only, you may fix the labeling task yourself: make your modifications, then click on either the Accept or Reject button.

After submitting your choice, you will be shown the next unreviewed task, until none are left in the current queue. If you may perform reviews in more than one of the project's review queues, clicking on Start Reviewing again will open the review queue of a different stage that still has tasks left to review.

Review Error Codes

Reviewers can assign specific error codes to the tasks they review (e.g. Slightly Incorrect, Very Incorrect, Empty). See more on error codes on their documentation page:

Issue Error Codes

Reviewing Consensus Tasks

In many projects, the "Disagreement" output of a Consensus stage will be linked to the input of a "Review" stage, such that if no agreement was found among annotators, the task will be sent to a reviewer.

Which judgment gets reviewed?

For each task that moves from a Consensus stage into a Review stage, the judgment sent to the reviewer is the last one to have been submitted in the Consensus stage.

So, for example, if a task was sent to the Consensus stage and Labelers 1, 2, and 3 all annotated it, but Labeler 2 was the last among them to click on "Submit", then the judgment from Labeler 2 is the one that will be shown to the reviewer.
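In other words, the selection rule is simply "latest submission timestamp wins." The sketch below illustrates it with made-up data; the field names are hypothetical, not Ango Hub's actual data model.

```python
from datetime import datetime

# Hypothetical judgments: who submitted each one, and when.
judgments = [
    {"labeler": "Labeler 1", "submitted_at": datetime(2024, 5, 1, 10, 15)},
    {"labeler": "Labeler 2", "submitted_at": datetime(2024, 5, 1, 10, 42)},
    {"labeler": "Labeler 3", "submitted_at": datetime(2024, 5, 1, 10, 30)},
]

# The judgment shown to the reviewer is the most recently submitted one.
shown_to_reviewer = max(judgments, key=lambda j: j["submitted_at"])
print(shown_to_reviewer["labeler"])  # Labeler 2
```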

How can I compare different judgments?

While the judgment sent to reviewers is the last one to have been submitted (see previous section), reviewers currently have two ways to look at other annotators' judgments and compare them.

For classifications, you will see a percentage next to the classification's name. Clicking on it reveals a popover.

From this popover, you can see who answered which question and how. The percentage at the top left is the consensus score for that classification. If the threshold has been met for that classification, the percentage is shown in green; otherwise, it is shown in red.
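Ango Hub's exact scoring formula is not covered on this page; as a simplified stand-in, the sketch below computes the share of annotators who gave the most common answer and compares it against a hypothetical threshold, mirroring the green/red display.

```python
from collections import Counter

def classification_consensus(answers: list[str]) -> float:
    """Share of annotators who gave the most common answer (a
    simplified stand-in for the consensus score in the popover)."""
    top_count = Counter(answers).most_common(1)[0][1]
    return top_count / len(answers)

answers = ["Cat", "Cat", "Dog"]  # hypothetical classification answers
THRESHOLD = 0.75                 # hypothetical project threshold

score = classification_consensus(answers)
color = "green" if score >= THRESHOLD else "red"
print(f"{score:.0%} shown in {color}")  # 67% shown in red
```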

For other annotation tools (i.e., anything other than classifications), the only way to compare judgments currently is to open the "Stage History" drawer on the right side of the screen and click between judgments to inspect each one.
