2008
DOI: 10.1007/s10606-008-9080-9

The CACHE Study: Group Effects in Computer-supported Collaborative Analysis

Abstract: The present experiment investigates effects of group composition in computer-supported collaborative intelligence analysis. Human cognition, though highly adaptive, is also quite limited, leading to systematic errors and limitations in performance, that is, biases. We experimentally investigated the impact of group composition on an individual's bias by composing groups that differ in whether their members' initial beliefs are diverse (heterogeneous group) or similar (homogeneous group). We study three-member, …

Cited by 41 publications (43 citation statements) · References 49 publications (53 reference statements)
“…potentially plausible "narratives", in our earlier terminology). ACH-based tools clearly scaffold knowledge-building discourse in a disciplined way. For instance, CACHE (Convertino et al. 2008; Shrager et al. 2010) provides a collaborative ACH space for the exploration of hypotheses in open science; it uses notification spreading through provenance chains to simplify revision updating, mark questionable results, and inform scientists when their hypotheses or claims may need to be revised. Interestingly, just as intelligence analysis tools such as ACH provide support for detailed analysis of competing options that might be expressed in a collaborative discourse platform such as Cohere, Smallman (2008, p. 334) notes the need within the intelligence analysis community for better support for argument analysis and visualization, of the sort provided by Cohere or Rationale (van Gelder 2002).…”
Section: The Concept of Contested Collective Intelligence
confidence: 99%
“…Such sharing can improve both process and performance in collaborative sensemaking and analysis tasks, measured by accuracy of task outcomes [19,39] and recall of decisions [13]. Sharing can also help analysts flexibly organize data to discover insights and form schemas [39,40] or concept maps [15], identify and link evidence from different sources [4,6], identify entities in the data [9], and encounter otherwise hidden and overlooked connections between pieces of information [1,9,11,25], leading to better decisions [19]. The success of shared workspaces in team performance has been attributed to promoting the exchange of information and data with others [19], which in turn improves common ground [39] and awareness of the status of the analysis task and of others' activities in the task [13,34].…”
Section: Background: Information Sharing in Collaborative Analysis
confidence: 99%
“…This is a general problem for evaluating collaborative analysis tools: the high stakes and established processes of real investigations make it hard to deploy new tools in the field. Thus, tool evaluations tend to involve short-term tasks completed by university students [1,11,12,13,19,23,39]. These controlled contexts and tasks do provide compensating advantages, such as the ability to systematically vary tools in ways that allow researchers to carefully examine specific factors that together build a body of knowledge to inform real-world tool design.…”
Section: Limitations and Generalization
confidence: 99%