This article presents a review and analysis of the research literature in social Q&A (SQA), a term describing systems where people ask, answer, and rate content while interacting around it. The growth of SQA is contextualized within the broader trend of user-generated content from Usenet to Web 2.0, and alternative definitions of SQA are reviewed. SQA sites have been conceptualized in the literature as simultaneous examples of tools, collections, communities, and complex sociotechnical systems. Major threads of SQA research include user-generated and algorithmic question categorization; answer classification and quality assessment; studies of user satisfaction, reward structures, and motivation for participation; and how trust and expertise are both operationalized by and emerge from SQA sites. Directions for future research are discussed, including developing more refined conceptions of SQA site participants and their roles, unpacking the processes by which social capital is achieved, managed, and wielded in SQA sites, refining question categorization, conducting research within and across a wider range of SQA sites, applying economic and game-theoretic models, and problematizing SQA itself.
The most sustainable online communities are those that allow and encourage their users to have a voice in how the community evolves. The proliferation of online communities with collaborative filtering mechanisms, where user feedback is aggregated to shape future interactions, makes it necessary to understand why participants in online communities value the content they do. Building on the concepts of users as specialists and synthesists developed in previous research, this study examines Answerbag, an online question-answering community where users rate one another's answers to provide collaborative filtering. In this environment, specialists are operationalized as those who claim expertise in a given topic and answer questions without referencing other sources, and synthesists as those who include one or more references to external sources in their answers. The results of the study suggest that within the Answerbag community as a whole, the answers of synthesists tended to be rated more highly than those of specialists, though answers provided by specialists were rated more highly within certain categories. The consequences of differences in the perceived value of information provided by specialists and synthesists are examined, and avenues for future research are discussed.