Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3313831.3376448

Transparency of CHI Research Artifacts: Results of a Self-Reported Survey

Abstract: Several fields of science are experiencing a "replication crisis" that has negatively impacted their credibility. Assessing the validity of a contribution via replicability of its experimental evidence and reproducibility of its analyses requires access to relevant study materials, data, and code. Failing to share them limits the ability to scrutinize or build upon the research, ultimately hindering scientific progress. Understanding how the diverse research artifacts in HCI impact sharing can help produce in…

Cited by 55 publications (59 citation statements) | References 57 publications

“…Recognition of the above problems and calls to action have permeated human-computer interaction (HCI) [8,12,55,58] and connected fields, including computer science [7], health informatics [9], graphics and visualization [4,30], and computing education [1].…”
Section: Open Science and HCI (citation type: mentioning)
confidence: 99%
“…Of the eight missing responses, seven papers reported partial data, which are included in this meta-analysis. In a survey [41], the authors found that some researchers are unwilling to share their data and information. It may be that some of the contacted authors held a similar view.…”
Section: Reporting Practices, Data Availability, and Email Availability (citation type: mentioning)
confidence: 99%
“…I acknowledge that my work has the following three limitations. First, since there is no common practice of preregistration in the HCI community, it is almost impossible to analyze the results of unpublished studies [4,41]. Therefore, I could not perform a publication-bias assessment.…”
Section: Limitations (citation type: mentioning)
confidence: 99%
“…Wacharamanotham et al. inspected the low availability of artifacts in the HCI community and found four factors that lead researchers to refrain from sharing artifacts: concern about personally identifiable data; lack of participants' permission; lack of motivation, resources, or recognition; and doubt about the usefulness of their artifacts outside their own studies [24]. Dahlgren conducted an observational study during the OOPSLA 2019 artifact evaluation and found that the most prominent negative comments during artifact review stem from limited physical resources or review time to test artifacts, and from problems with documentation [8].…”
Section: Background and Related Work (citation type: mentioning)
confidence: 99%
“…Contrary to the ACM guidelines, 61 of the 79 analyzed calls for artifacts explicitly state a purpose for artifact evaluation. Across all analyzed calls, the most frequently named purpose of artifact evaluation is to enable reuse of artifacts (32 calls), followed by reproducibility (24) and enabling future research to compare against published results (17). When divided by community, programming-languages conferences named reproducibility (21) more often than reuse.…”
Section: Calls for Artifacts (CfAs) (citation type: mentioning)
confidence: 99%