Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3313831.3376380
Decipher: An Interactive Visualization Tool for Interpreting Unstructured Design Feedback from Multiple Providers

Cited by 14 publications (4 citation statements). References 32 publications.
“…Another consideration is how feedback should be collected and provided to students. As is evident in a number of crowd feedback systems (e.g., [29,41,43]), visualization and aggregation of feedback supports the feedback receiver in making sense of the feedback. Research on mechanisms for aggregating crowdsourced design feedback is only recently emerging (e.g., [43]).…”
Section: Recommendations for Crowdsourcing Design Feedback in the Classroom (mentioning)
confidence: 97%
“…As is evident in a number of crowd feedback systems (e.g., [29,41,43]), visualization and aggregation of feedback supports the feedback receiver in making sense of the feedback. Research on mechanisms for aggregating crowdsourced design feedback is only recently emerging (e.g., [43]). Our study found that students in particular valued diversity in the responses and appreciated the direct contrast between the feedback from the two sources.…”
Section: Recommendations for Crowdsourcing Design Feedback in the Classroom (mentioning)
confidence: 97%
“Texts relating to interaction with the system can be produced by the users themselves, such as: words entered in search fields [Ruotsalo et al 2018], extracted from the metadata of visited pages [Du et al 2018], or used in a card-sorting session [Paul 2014]; reviews of a piece of software [Yen et al 2020]; transcriptions of user comments [Sykownik et al 2019]; and annotations made by the user while using the software [Goodell et al 2006]. Metrics on UX, satisfaction, and efficiency were collected using questionnaires about the software's ease of use and aesthetic appeal [Bernhaupt et al 2020, Bernhaupt et al 2019, Dittrich et al 2019, Lachner et al 2016].…”
Section: Which Data Are Explored by the Visualizations? (unclassified)
“…user movement and nearby objects) [Büschel et al 2021, Ebel et al 2021] and to identify the user's profile from demographic and interaction data [Chen et al 2019]. Finally, producing annotations to identify events and to code UX problems is found in some visualizations [Liu and Eagan 2021, Segura et al 2018, Yen et al 2020].…”
Section: What Are the Purposes of the Visualizations? (unclassified)