2020
DOI: 10.3390/s20082208
Closed-Loop Attention Restoration Theory for Virtual Reality-Based Attentional Engagement Enhancement

Abstract: Today, as media and technology multitasking becomes pervasive, the majority of young people face a challenge regarding their attentional engagement (that is, how well their attention can be maintained). While various approaches to improve attentional engagement exist, it is difficult to produce an effect in younger people, due to the inadequate attraction of these approaches themselves. Here, we show that a single 30-min engagement with an attention restoration theory (ART)-inspired closed-loop software progra…

Cited by 16 publications (20 citation statements)
References 55 publications
“…As future direction, we can list the main recommended aspects that must guide attention detection/investigation through the EEG signal: identify attention allocation through the N100, N200, P100 and P300 components in fronto-central-occipital brain areas (Vogt et al, 2015; Jaquess et al, 2017), the P300 in fronto-parietal brain areas (Rohani and Puthusserypady, 2015; Vogt et al, 2015; Yu et al, 2015; Causse et al, 2016; Chen et al, 2016; Lamti et al, 2016; Jaquess et al, 2017; Chung et al, 2018; Li et al, 2020d) and the SSVEP component in the occipital area (Leite et al, 2018; Chu and D'Zmura, 2019), as cited in Figure 3; and investigate attention through increases and decreases in brain-wave power and the beta (12–31.25 Hz)/theta (3–8 Hz) ratio in fronto-parietal brain areas, θ (Clemente et al, 2014; Fuentes-García et al, 2019; Lim et al, 2019), in the occipital area, θ (Savage et al, 2013), and in the frontal area, αβθδ (Clemente et al, 2014; Cowley and Ravaja, 2014; Jagannath and Balasubramanian, 2014; Yin and Zhang, 2014; Lee et al, 2015; Hazarika et al, 2018; Lim et al, 2019; Li et al, 2020d), as cited in Figure 3.…”
Section: Discussionmentioning
confidence: 99%
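The beta/theta power ratio mentioned in the citation statement above can be computed from an EEG trace via a power spectral density estimate. The sketch below is an illustrative, hypothetical implementation (the helper names `band_power` and `beta_theta_ratio` are not from the cited works), assuming Welch's method and the band limits quoted in the statement (beta 12–31.25 Hz, theta 3–8 Hz):

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Power in a frequency band, integrated from a Welch PSD estimate."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    df = freqs[1] - freqs[0]          # frequency-bin width
    return np.sum(psd[mask]) * df

def beta_theta_ratio(eeg, fs):
    """Beta (12-31.25 Hz) / theta (3-8 Hz) power ratio,
    one attention index discussed in the EEG literature cited above."""
    beta = band_power(eeg, fs, (12.0, 31.25))
    theta = band_power(eeg, fs, (3.0, 8.0))
    return beta / theta

# Synthetic sanity check: a signal dominated by 20 Hz (beta-band) activity
# should yield a ratio above 1; a 5 Hz (theta-band) signal, below 1.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
beta_heavy = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
theta_heavy = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)
print(beta_theta_ratio(beta_heavy, fs) > 1)   # True
print(beta_theta_ratio(theta_heavy, fs) < 1)  # True
```

In practice the ratio would be computed per electrode (e.g. fronto-parietal sites, as the cited works recommend) after artifact rejection, rather than on a single raw trace.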
“…Investigate attention by means of ERP components to identify attention allocation through N100, N200, P100 and P300 in fronto-central-occipital brain areas (Vogt et al, 2015; Jaquess et al, 2017); P300 in fronto-parietal brain areas (Rohani and Puthusserypady, 2015; Vogt et al, 2015; Yu et al, 2015; Causse et al, 2016; Chen et al, 2016; Lamti et al, 2016; Jaquess et al, 2017; Chung et al, 2018; Li et al, 2020d) and the SSVEP component in the occipital area (Leite et al, 2018; Chu and D’Zmura, 2019), as cited in Figure 3;…”
Section: Discussionmentioning
confidence: 99%
“…Thus, a multimodal objective measurement can be planned for future study. This is especially pertinent given progress on AI-based physiological feature learning for VR applications [29] and highly integrated research-grade [30] or commercial [31] VR-biosensing platforms.…”
Section: Limitation and Conclusionmentioning
confidence: 99%