2016
DOI: 10.1177/155019061601200213

Crowdsourcing as Practice and Method in the Smithsonian Transcription Center

Abstract: This article employs qualitative research design and methods to examine volunteer motivation and continued experiences of participation in the Smithsonian Institution's Transcription Center (TC), a large-scale crowdsourcing project and space for engagement with collections, Smithsonian Institution staff, and peer volunteers, or volunpeers. Data were obtained from two focus groups conducted on August 24 and 25, 2015. Following these discussions, an experimental method of crowdsourced authorship was developed by…

Cited by 1 publication (2 citation statements) · References 16 publications
“…(3) the materials are unique and of cultural value; and (4) participating in a team effort is rewarding. These results largely concur with those observed by other researchers (Ferriter et al., 2016; Oomen & Aroyo, 2011; Raddick et al., 2009). But the homogeneity of the responses also corresponds to the ideas and incentives featured in publicity about CWW on the radio and TV, online, and in print, suggesting that the publicity materials were read and remembered by the participants.…”
Section: Deeper Meaning in Crowdsourcing
supporting, confidence: 91%
“…Quantitative methods, namely participant surveys and log data analysis, in combination with other indirect methods such as exploration of user profile pages (Lane, 2017; Raddick et al., 2009; Wood et al., 2011), are used most frequently to explore user involvement. Focus groups are an additional and productive way to collect more individualized points of view (Ferriter et al., 2016).…”
Section: Sanita Reinsone
mentioning, confidence: 99%