2022
DOI: 10.1177/1071181322661537

Joint Activity Testing: Towards a Multi-Dimensional, High-Resolution Evaluation Method for Human-Machine Teaming

Abstract: Quantitative evaluations of human-machine teams (HMTs) are desperately needed to ensure technological implementations are helpful rather than harmful to overall system performance; however, as machines increasingly behave like active cognitive teammates, traditional evaluation strategies risk overestimating HMT capabilities. A reliable HMT evaluation method should include multiple high-resolution, continuous measures for both system performance and system challenges that can be implemented unobtrusively in real…

Cited by 2 publications (2 citation statements) · References 20 publications (28 reference statements)
“…System-level user requirement studies can also inform the integration process ( 147 , 148 ). Additionally, cognitive systems engineering research could be an area that provides valuable quantitative evaluations on how computational tools integrate into domain expert workflows, such as the recently proposed joint activity testing framework ( 152 ). The need for more user-friendly kinematics measurement, segmentation, and analysis methods, as well as investigating how to integrate kinematic analyses into domain expert workflows (i.e., human factors), underscores the multidisciplinary approach required to meaningfully improve the quality and administration of UEFAs.…”
Section: Discussion
confidence: 99%
“…JAT has now also been used to evaluate and compare the performance of teams with and without AI or advanced analytics, and has proved valuable even in high-stakes, high-uncertainty settings with only a small number (10-15) of participants (Morey et al., 2020; Morey et al.…”
Section: Using Joint Activity Testing To Compare Team Performance Wit...
confidence: 99%