Companion Publication of the 2021 International Conference on Multimodal Interaction 2021
DOI: 10.1145/3461615.3485442
Non-Verbal behaviors analysis of healthcare professionals engaged with a Virtual-Patient

Cited by 1 publication (1 citation statement) · References 28 publications
“…The facial Action Units (AUs) are automatically extracted by OpenFace (Baltrusaitis et al., 2018), while features related to self-touching of the head, proxemics, and body openness are computed from the body and hand pose estimates produced by OpenPose (Cao et al., 2021). This set of non-verbal cues is then transformed into symbols and evaluated throughout the interaction (for more details see Zagdoun et al., 2021) to obtain explainable measures that could be indicators of openness to others, warmth, or empathy (e.g., the duration of the clinician's smiles, physical proximity, or body opening during the interaction; see Mast and Hall, 2017), or of discomfort or stress exhibited by the clinician (e.g., the number of times the clinician touches their head; see Harrigan, 1985). For the feedback, the symbols are contextualized by considering the behaviors of the virtual patient in order to assess the consistency of the clinician's behaviors (e.g., smiling when approaching the patient, not appearing nervous through over-gesturing or excessive self-touching).…”
Section: VirtualZ-A VP Tool For Training Clinicians To Communicate Wit...mentioning
confidence: 99%
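The pipeline described above reduces frame-level detections to a few explainable summary measures (e.g., total smile duration, number of head-touch events). A minimal sketch of that reduction step, assuming per-frame boolean signals (an AU12 "smile" flag from an OpenFace-style AU extractor and a head-touch flag derived from pose distances); the input format and function names are illustrative assumptions, not the authors' code:

```python
# Hedged sketch: turning frame-level non-verbal cues into explainable
# summary measures of the kind described in the citing paper.
# Input format (per-frame boolean flags) is an assumption for illustration.

def smile_duration_seconds(au12_active, fps=30.0):
    """Total time (in seconds) the smile AU (AU12) is active."""
    return sum(au12_active) / fps

def head_touch_count(touching):
    """Count distinct self-touch events as rising edges in a boolean series."""
    count = 0
    prev = False
    for now in touching:
        if now and not prev:  # transition from not-touching to touching
            count += 1
        prev = now
    return count

# Example: a 30 fps clip where the clinician smiles for 60 frames
# and touches their head twice.
au12 = [False] * 30 + [True] * 60 + [False] * 30
touch = [False] * 50 + [True] * 10 + [False] * 40 + [True] * 5 + [False] * 15
print(smile_duration_seconds(au12))  # 2.0
print(head_touch_count(touch))       # 2
```

Measures like these are "explainable" in the sense used above: each number maps directly to an observable behavior that can be fed back to the clinician.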