What does reliability mean for building a grounded theory? What about when writing an auto-ethnography? When is it appropriate to use measures like inter-rater reliability (IRR)? Reliability is a familiar concept in traditional scientific practice, but how, and even whether, to establish reliability in qualitative research is an oft-debated question. For researchers in highly interdisciplinary fields like computer-supported cooperative work (CSCW) and human-computer interaction (HCI), the question is particularly complex because collaborators bring diverse epistemologies and training to their research. In this article, we use two approaches to understand reliability in qualitative research. We first investigate and describe local norms in the CSCW and HCI literature, then we combine examples from these findings with guidelines from methods literature to help researchers answer questions like: "should I calculate IRR?" Drawing on a meta-analysis of a representative sample of CSCW and HCI papers from 2016-2018, we find that authors use a variety of approaches to communicate reliability; notably, IRR is rare, occurring in around 1/9 of qualitative papers. We reflect on current practices and propose guidelines for reporting on reliability in qualitative research, using IRR as a central example of a form of agreement. The guidelines are designed to generate discussion and orient new CSCW and HCI scholars and reviewers to reliability in qualitative research.
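As a concrete illustration of the agreement statistic discussed above: IRR between two coders is commonly reported as Cohen's kappa, which corrects raw agreement for the agreement expected by chance. The sketch below is a minimal stdlib-only implementation; the example codes ("support", "advice") and data are hypothetical, not drawn from the article.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance from
    each rater's marginal label distribution.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal label frequencies.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[label] * counts_b[label] for label in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two coders labeling ten excerpts.
a = ["support", "support", "advice", "advice", "support",
     "advice", "support", "support", "advice", "support"]
b = ["support", "support", "advice", "support", "support",
     "advice", "support", "advice", "advice", "support"]
print(round(cohens_kappa(a, b), 3))  # → 0.583
```

Here the coders agree on 8 of 10 items (p_o = 0.8), but the skewed label distribution means chance agreement is already 0.52, so kappa lands well below the raw agreement rate; this is exactly the correction that makes kappa a more conservative report than percent agreement.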
Andrea Forte is a Ph.D. candidate in Human-Centered Computing in the School of Interactive Computing at the Georgia Institute of Technology. She holds an MLIS from the University of Texas at Austin. Her research examines how social technologies support knowledge production and information sharing. Her work has been published in the areas of computer-supported cooperative work, online communities, and the learning sciences.

Vanessa Larco is a Program Manager on the Microsoft Surface team, where she works on designing and developing software for multitouch vision-based systems. She holds a B.S. in Computer Science from the Georgia Institute of Technology. She has an interest in understanding how productive groups organize themselves in technologically mediated spaces.

Amy Bruckman is an Associate Professor in the School of Interactive Computing at the Georgia Institute of Technology. Dr. Bruckman received her Ph.D. from the MIT Media Lab's Epistemology and Learning group in 1997, her MSVS from the Media Lab's Interactive Cinema Group in 1991, and her B.A. in physics from Harvard University in 1987. In 1999, she was named one of the 100 top young innovators in science and technology in the world (TR100) by Technology Review magazine. In 2002, she was awarded the Jan Hawkins Award for Early Career Contributions to Humanistic Research and Scholarship in Learning Technologies.

Abstract: How does "self-governance" happen in Wikipedia? Through in-depth interviews with 20 individuals who have held a variety of responsibilities in the English-language Wikipedia, we obtained rich descriptions of how various forces produce and regulate social structures on the site. Although Wikipedia is sometimes portrayed as lacking oversight, our analysis describes Wikipedia as an organization with highly refined policies, norms, and a technological architecture that supports organizational ideals of consensus building and discussion.
We describe how governance on the site is becoming increasingly decentralized as the community grows and how this is predicted by theories of commons-based governance developed in offline contexts. We also briefly examine local governance structures called WikiProjects through the example of WikiProject Military History, one of the oldest and most prolific projects on the site.
Seeking and providing support is challenging. When people disclose sensitive information, audience responses can substantially impact the discloser's wellbeing. We use mixed methods to understand responses to online sexual abuse-related disclosures on Reddit. We characterize disclosure responses, then investigate relationships between post content, comment content, and anonymity. We illustrate which types of support sought and provided in posts and comments co-occur. We find that posts seeking support receive more comments, and that comments from “throwaway” (i.e., anonymous) accounts are more likely on posts that are also from throwaway accounts. Anonymous commenting enables commenters to share intimate content such as reciprocal disclosures and supportive messages, and commenter anonymity is not associated with aggressive or unsupportive comments. We argue that anonymity is an essential factor in designing social technologies that facilitate support seeking and provision in socially stigmatized contexts, and we provide implications for social media site design. CAUTION: This article includes content about sexual abuse.
Abstract: Traditional introductory computer science (CS) courses have had little success engaging non-computer science majors. At Georgia Institute of Technology, where introductory computer science courses are a requirement for CS majors and non-majors alike, two tailored introductory courses were introduced as an alternative to the traditional course. The results were encouraging: more non-majors succeeded (completed and passed) in tailored courses than in the traditional course, students expressed fewer negative reactions to the course content, and many reported that they would be interested in taking another tailored CS course. The authors present findings from a pilot study of the three courses and briefly discuss some of the issues surrounding the tailored courses for non-majors: programming, context, choice of language, and classroom culture.