2016
DOI: 10.18608/jla.2015.23.3
LATUX: an Iterative Workflow for Designing, Validating and Deploying Learning Analytics Visualisations

Abstract: Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, frameworks that meet the specific demands of the cross-disciplinary space defined by learning analytics are now needed. In particular, LAK needs a systematic workflow…

Cited by 50 publications (39 citation statements)
References 43 publications
“…In response, there has been a growing interest in collaborating with educational stakeholders early on in the design of writing analytics tools, and learning analytics tools in general (e.g. Buckingham Shum, Ferguson, & Martinez-Maldonado, 2019; Dollinger, Liu, Arthars, & Lodge, 2019; Martinez-Maldonado et al., 2016; Wise & Jung, 2019). By including information from writing specialists to identify why and how particular affordances are needed, rather than simply including all features that are technically feasible, the design could be improved (Cotos, 2015). This way, the design can also be better tuned to the educational context (Conde & Hernández-García, 2015).…”
Section: Data About Writing Processes
confidence: 99%
“…Each dashboard builds upon findings of the previous, taking into account the stakeholders and the specific learning context in which it will be deployed. They are built as low-fidelity prototypes at first, with four high-fidelity dashboard prototypes deployed in authentic settings during pilot studies [25]. The dashboards were developed using web technologies such as D3.js, Processing.js, and Node.js.…”
Section: Deployed Dashboards
confidence: 99%
“…Although the framework is an excellent thinking tool to evaluate the impact of a dashboard (e.g., see Molenaar and van Campen, 2017), it mainly captures the evaluation part, and it does not provide a full model of how to design dashboards guiding the whole process from domain characterization to evaluation. Another welcome exception is the four-stage workflow (problem identification, low-fidelity prototyping, high-fidelity prototyping, pilot studies) by Martinez-Maldonado et al. (2015) to guide the design and deployment of awareness tools for instructors and students. However, the workflow does not capture the principles of visualizations nor the challenges to tackle while designing dashboards.…”
Section: Teachers' Dashboards Design Research
confidence: 99%