Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications 2019
DOI: 10.1117/12.2519085
Improving motion sickness severity classification through multi-modal data fusion

Cited by 17 publications (16 citation statements)
References 11 publications
“…Again, we compare the baseline-corrected mean values of (G_exp) and (G_ctrl), with both time windows being 110 s. Our results show a normal respiration rate, with no significant difference between (G_exp) and (G_ctrl) (p = .36, t = −0.92). This is consistent with previous research indicating that respiration quality and rhythm relate to susceptibility to MS but not to its occurrence or severity [34]. Second, respiration stability was assessed by analyzing the power spectrum using Welch's method.…”
Section: Results and Analysis (supporting)
confidence: 88%
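The excerpt above assesses respiration stability via the power spectrum computed with Welch's method. A minimal sketch of that kind of analysis is below; the sampling rate, window length, and synthetic signal are illustrative assumptions, not details from the cited study.

```python
# Hypothetical sketch: assessing respiration stability from a respiration
# signal via Welch's power spectral density estimate.
import numpy as np
from scipy.signal import welch

fs = 25.0                          # assumed respiration-belt sampling rate (Hz)
t = np.arange(0, 110, 1 / fs)      # 110 s analysis window, as in the excerpt
resp = np.sin(2 * np.pi * 0.25 * t)  # synthetic signal at ~15 breaths/min

# Welch's method: average periodograms over overlapping segments.
freqs, psd = welch(resp, fs=fs, nperseg=1024)

# A stable rhythm concentrates power near one dominant frequency;
# the peak frequency times 60 gives breaths per minute.
peak_hz = freqs[np.argmax(psd)]
print(f"dominant respiration frequency: {peak_hz:.2f} Hz "
      f"(~{peak_hz * 60:.0f} breaths/min)")
```

Stability could then be quantified, for example, as the fraction of total spectral power falling within a narrow band around the peak frequency.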
“…Critically, the majority of these analyses examine data in a unimodal fashion, where changes in a single sensor system are compared with changes in the level of reported sickness during the experiment, or, more commonly, on the SSQ after the experiment has concluded and the user has removed themselves from the XR environment (Dennison & D'Zmura, 2017; Dennison et al., 2016; Lin et al., 2007). Recent work has demonstrated that the analysis and subsequent extraction of meaningful features from multimodal data can be used to train machine learning models that classify cybersickness severity over time (Dennison et al., 2019). This work has shown that models trained on multimodal data outperform models built only on features from unimodal sensor data in the majority of cases.…”
Section: Noted (mentioning)
confidence: 99%
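The multimodal-fusion idea described above can be sketched as early fusion: features extracted from several sensor streams are concatenated into one vector per time window and fed to a single classifier, whose performance can be compared against a unimodal baseline. The feature names, severity labels, and model choice below are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical early-fusion sketch: concatenate per-window features from
# several physiological modalities and train a severity classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # number of time windows (synthetic)

# Synthetic per-window features from three assumed modalities.
ecg_feats = rng.normal(size=(n, 4))    # e.g., heart-rate statistics
resp_feats = rng.normal(size=(n, 3))   # e.g., breathing rate / stability
eeg_feats = rng.normal(size=(n, 5))    # e.g., band-power features
severity = rng.integers(0, 3, size=n)  # illustrative 3-level severity label

# Early fusion: stack modality features into one vector per window.
X_multi = np.hstack([ecg_feats, resp_feats, eeg_feats])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
multi_acc = cross_val_score(clf, X_multi, severity, cv=5).mean()
uni_acc = cross_val_score(clf, ecg_feats, severity, cv=5).mean()
print(f"multimodal CV accuracy: {multi_acc:.2f}, unimodal: {uni_acc:.2f}")
```

On real data, the cited finding is that the fused feature set tends to outperform any single modality; with the random placeholder data here, both scores simply hover near chance.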
“…Future studies with a larger sample, various types of VR hardware, and VR software with substantially more diverse features will offer further insights into the impact of software features on VRISE intensity, as well as provide additional support for the VRNQ's structural model. Lastly, neuroimaging (e.g., electroencephalography) and physiological data (e.g., heart rate) may be used to correlate with, classify, and predict VRISE symptomatology (Kim et al., 2005; Dennison et al., 2016, 2019). Hence, future studies should consider collecting neuroimaging and/or physiological data that could further elucidate the relationship between VRNQ's VRISE score(s) and brain-region activation or cardiovascular responses (e.g., heart rate).…”
Section: Discussion (mentioning)
confidence: 99%