Reviewing labeled data is a collaborative quality assurance technique. Reaching state-of-the-art model performance often takes a dataset with hundreds of thousands to millions of quality labels, and reviewing a significant percentage of those labels may well require a team of reviewers working simultaneously. The Labelbox review tooling keeps this collaborative review step streamlined and transparent.
The review queue automatically distributes labeled images and tracks progress.
The review queue applies the following work distribution rules to streamline collaboration:
- To ensure that labeling and reviewing operations can happen concurrently, only images that have been labeled, categorized, or skipped are entered into the review queue.
- To ensure that the reviewers' work does not overlap, all images in a reviewer's queue are unique.
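The two distribution rules above can be sketched as a simple shared queue. This is an illustrative sketch with hypothetical names, not the Labelbox implementation: eligible statuses and the pop-to-assign step mirror the rules, nothing more.

```python
from dataclasses import dataclass, field

# Per the rules above, only these statuses make an image reviewable.
REVIEWABLE = {"labeled", "categorized", "skipped"}

@dataclass
class ReviewQueue:
    """Distributes reviewable images so reviewers' work never overlaps."""
    pending: list = field(default_factory=list)   # images awaiting review
    assigned: dict = field(default_factory=dict)  # reviewer -> set of image ids

    def enqueue(self, image_id, status):
        # Rule 1: only labeled, categorized, or skipped images
        # enter the review queue.
        if status in REVIEWABLE:
            self.pending.append(image_id)

    def next_for(self, reviewer):
        # Rule 2: pop from the single shared list, so no two
        # reviewers ever receive the same image.
        if not self.pending:
            return None
        image_id = self.pending.pop(0)
        self.assigned.setdefault(reviewer, set()).add(image_id)
        return image_id
```

Because each image is removed from the shared list the moment it is handed out, every reviewer's queue is unique by construction.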
Turn on review in Settings > Quality and choose the percentage of your project's images that you would like reviewed.
The project overview in Labelbox shows the progress of your overall project across label and review operations. The review section tracks the remaining number of images to be reviewed, the number of thumbs up reviews, and the number of thumbs down reviews.
Of the 41 images uploaded to this project, 87% have been labeled. Review was set to 50% (i.e., a total of 21 labels to review). 16 images are left to be reviewed. Of the 5 images reviewed so far, 2 scored poorly and 3 scored positively.
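The arithmetic in this example can be checked with a short snippet. It assumes the review target is rounded up to a whole image (50% of 41 is 20.5, and the example reports 21); that rounding rule is an assumption, not documented behavior.

```python
import math

total_images = 41
review_pct = 0.50     # review percentage chosen in Settings > Quality
reviewed_so_far = 5   # 2 thumbs down + 3 thumbs up

# Assumed: Labelbox rounds the review target up to a whole image.
review_target = math.ceil(total_images * review_pct)  # 21 labels to review
remaining = review_target - reviewed_so_far           # 16 left to review
```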
The labeling and review queues are entered via separate tabs (Start Labeling and Review Labels, respectively). The interface for each mode is optimized for efficiency in that type of activity.