2018
DOI: 10.1038/s41592-018-0069-0

Quanti.us: a tool for rapid, flexible, crowd-based annotation of images

Abstract: We describe Quanti.us, a crowd-based image-annotation platform that provides an accurate alternative to computational algorithms for difficult image-analysis problems. We used Quanti.us for a variety of medium-throughput image-analysis tasks and achieved 10-50× savings in analysis time compared with that required for the same task by a single expert annotator. We show equivalent deep learning performance for Quanti.us-derived and expert-derived annotations, which should allow scalable integration with tailore…

Cited by 55 publications (41 citation statements)
References 39 publications
“…Alternative methods for collecting image annotations now exist, such as the Quanti.us system (30) for distributing manual annotations to nonexpert workers on the internet.…”
Section: Expert Annotations (mentioning)
confidence: 99%
“…Our prototype tool was useful for collecting nucleus annotations for this research; however, significant development is needed to improve it. Alternative methods for collecting image annotations now exist, such as the Quanti.us system (30) for distributing manual annotations to nonexpert workers on the internet. This may enable the annotation process for new projects to be scaled up quickly to many more images and, according to their findings, to reach precision similar to that of experts when multiple workers provide independent annotation replicates.…”
Section: Expert Annotations (mentioning)
confidence: 99%
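To make the replicate-aggregation idea in the excerpt above concrete, here is a minimal sketch assuming point-click annotations and a density-based grouping step; the function name and the eps_px and min_workers values are illustrative assumptions, not Quanti.us's published pipeline.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def consensus_points(worker_clicks, eps_px=8.0, min_workers=3):
    """Aggregate replicate point annotations from multiple workers.

    worker_clicks: list of (N_i, 2) arrays, one per worker, holding (x, y)
    click coordinates in pixels. Points within eps_px of each other are
    grouped, and a group is kept only if at least min_workers points
    support it (a proxy for agreement across workers). Returns the (M, 2)
    array of cluster centroids, i.e. the consensus annotations.
    """
    points = np.vstack(worker_clicks)
    labels = DBSCAN(eps=eps_px, min_samples=min_workers).fit_predict(points)
    return np.array([points[labels == k].mean(axis=0)
                     for k in set(labels) if k != -1])

# Example: three workers mark roughly the same two nuclei.
w1 = np.array([[10.0, 12.0], [50.0, 48.0]])
w2 = np.array([[11.0, 11.0], [52.0, 50.0], [90.0, 90.0]])  # one spurious click
w3 = np.array([[ 9.0, 13.0], [51.0, 49.0]])
print(consensus_points([w1, w2, w3]))  # two centroids; the outlier is dropped
```

Taking the centroid of each agreed-upon cluster is one simple way independent nonexpert replicates can average out individual click noise, which is consistent with the excerpt's observation about approaching expert precision.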
“…For applications that require much greater scale, recruitment of more annotators may be necessary. Methods for recruiting annotators include paying annotators a small amount of money for each annotation by integrating with Amazon Mechanical Turk [9], gamifying the app [23], or creating a citizen science effort [1]. We envision that this type of accurately human-curated image will support (by providing ground truth) and complement (in rare and unanticipated scenarios) machine-learning approaches as they become dominant in image-based analyses for many fields of scientific inquiry.…”
Section: Future Perspective (mentioning)
confidence: 99%
“…Some automated tools require extensive tuning or parameter optimization prior to annotation to enhance accuracy, and many image-processing pipelines are not well suited for heterogeneous image sets. In addition, many tools for human annotation limit the way users can define image features of interest, for example, via rectangles, polygons, or circles [9]. Annotation speed is limited by the complexity of the annotation software and, ultimately, by how quickly annotators can mark phenotypes accurately [10].…”
Section: Introduction (mentioning)
confidence: 99%
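As an illustration of the shape-flexibility limitation described above, here is a minimal sketch of an annotation record that is not tied to a single geometry; the schema and field names are hypothetical, not the format of any cited tool.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One image annotation; the geometry type is not fixed in advance."""
    image_id: str
    kind: str         # "point", "rect", "circle", or "polygon"
    coords: list      # interpretation depends on kind (see examples below)
    label: str = ""
    worker_id: str = ""

# The same record type covers all the shape families mentioned above:
ann_point   = Annotation("img_001", "point",   [(34.5, 72.0)], label="nucleus")
ann_rect    = Annotation("img_001", "rect",    [(10, 10), (60, 40)])        # two corners
ann_circle  = Annotation("img_001", "circle",  [(50, 50), (50, 58)])        # center + edge point
ann_polygon = Annotation("img_002", "polygon", [(0, 0), (12, 3), (9, 15)])  # vertex list
```

Keeping the geometry kind as data rather than as separate record types is one way an annotation tool could avoid restricting users to a single predefined shape.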
“…hampering efforts to apply deep learning in many biological domains 4, we opted to use a pragmatic approach, treating the segmentation and classification steps as separate problems (Fig. 1a).…”
(mentioning)
confidence: 99%
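The excerpt above describes treating segmentation and classification as separate problems; a minimal sketch of that two-stage pattern follows, with segment and classify passed in as stand-in callables, since the cited paper's actual models are not given here.

```python
import numpy as np

def two_stage_pipeline(image, segment, classify, patch=32):
    """First locate objects, then classify each one independently.

    segment(image)  -> list of (x, y) object centroids.
    classify(crop)  -> label for one fixed-size crop around a centroid.
    Returns a list of ((x, y), label) pairs.
    """
    results = []
    for (x, y) in segment(image):
        half = patch // 2
        x0, y0 = int(max(x - half, 0)), int(max(y - half, 0))
        crop = image[y0:y0 + patch, x0:x0 + patch]
        results.append(((x, y), classify(crop)))
    return results

# Tiny demo with dummy stand-ins for the two stages:
img = np.zeros((100, 100), dtype=np.float32)
fake_segment = lambda im: [(20, 20), (70, 65)]                         # stand-in detector
fake_classify = lambda crop: "mitotic" if crop.mean() > 0 else "interphase"
print(two_stage_pipeline(img, fake_segment, fake_classify))
```

Decoupling the stages this way lets each be trained or replaced on its own, which matches the pragmatic framing in the excerpt.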