12th Asia-Pacific Software Engineering Conference (APSEC'05) 2005
DOI: 10.1109/apsec.2005.22

A user evaluation of synchronous collaborative software engineering tools

Cited by 15 publications (6 citation statements)
References 10 publications
“…Participants who attempted and/or completed the task were asked to fill out a survey about their experience, including the development environment they used, how much time they spent on each sub-task, and which activities they did and functionality they used as part of the task and which of these were most important. They were also asked to provide open-ended feedback on different aspects, and to report how demanding the task was using the NASA-TLX Task Load Index [38], a workload assessment that is widely used in usability evaluations in software engineering and other domains [22,75]. Participants indicate on a scale the temporal demand, mental demand, and effort required by the task, their perceived performance, and their frustration.…”
Section: General Procedures
confidence: 99%
“…Participants who completed or attempted the task were asked to complete a survey about their experience, including the development environment they used, the time they spent on each sub-task, the activities they performed and the functionality they used as part of the task and which of these were most important, as well as open-ended feedback on different aspects. They were also asked how demanding the task was using the NASA-TLX Task Load Index (Hart & Staveland, 1988), a workload assessment that is widely used in usability evaluations in software engineering and other domains (Cook et al, 2005;Salman & Turhan, 2018).…”
Section: Case Study Design
confidence: 99%
“…Even though we have not explicitly conducted an experiment to evaluate the usefulness of group collaboration, we searched for other works that have evaluated it. Cook et al [20] presented a user evaluation of collaborative tools for software engineering. They used a post-experiment questionnaire to evaluate the experiment's qualitative aspects.…”
Section: Usefulness of Group Collaboration
confidence: 99%