2019
DOI: 10.1007/978-3-030-06076-3_8

Web-Based Embodied Conversational Agents and Older People

Abstract: Within Human-Computer Interaction, there has recently been an important turn to embodied and voice-based interaction. In this chapter, we discuss our ongoing research on building online Embodied Conversational Agents (ECAs), specifically, their interactive 3D web graphics aspects. We present ECAs based on our technological pipeline, which integrates a number of free online editors, such as Adobe Fuse CC or MakeHuman, and standards, mainly BML (Behaviour Markup Language). We claim that making embodiment availab…
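The abstract names BML (Behaviour Markup Language) as the main standard in the pipeline. BML is an XML format for specifying an agent's multimodal behaviour, with cross-modal timing expressed through sync-point references. A minimal illustrative block is sketched below; the element names, attributes, and namespace follow the BML 1.0 standard, while the ids and utterance text are invented for the example:

```xml
<!-- A BML 1.0 block: speak a sentence, produce a beat gesture at speech
     onset, and nod when the speech ends. Sync points like "s1:start"
     refer to the timeline of the element with id "s1". -->
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="bml1">
  <speech id="s1" start="0">
    <text>Hello, how are you today?</text>
  </speech>
  <gesture id="g1" lexeme="BEAT" start="s1:start"/>
  <head id="h1" lexeme="NOD" start="s1:end"/>
</bml>
```

A BML realizer in the agent parses such a block and schedules the speech audio, lip movement, gesture, and head animation so that they stay synchronized via the declared sync points.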

Cited by 9 publications (4 citation statements)
References 38 publications
“…One approach to closing the loop may be to use acted conversations (e.g., Beechey et al 2019 , 2020 ) in combination with virtual interactive animated characters in the virtual reality lab. This could be done via embodied conversational agents ( Llorach et al 2019 ), which are animated characters in the virtual reality that are controlled by a virtual conversational agent (e.g., Llorach & Blat 2017 ) or an actor (interactive puppeteering, e.g., Husinsky & Bruckner 2018 ). Another potential way to close the loop is to combine virtual-reality glasses with an omnidirectional treadmill, which may increase the interactivity and involvement of the subject while they perform hearing-related tasks.…”
Section: General Discussion and Conclusion
confidence: 99%
“…In conventional practices, designers rely on specialized software to create 3D assets, which are subsequently integrated into applications by developers [15], [19], [26]. Improvements have been made in recent years to enhance this approach by focusing on the release of custom software to support the modeling or creation of 3D assets.…”
Section: Related Work
confidence: 99%
“…With ECAs, which are due to the interplay of visual and auditory interface design even more complex to implement, there is comparatively less guidance. However, there are avatar model “construction kits”, animation libraries, and lip-sync plugins available that can be integrated into tools commonly used in game development (e.g., “Unity”) to avoid starting from scratch (see Llorach et al 2019 ). The avatar can then be built as web-, desktop-, mobile- or even as an application for virtual/augmented reality (VR/AR) glasses and controlled in real-time.…”
Section: Related Work and Opportunities for Future Research
confidence: 99%