Virtual reality, augmented reality, and mixed reality (VR/AR/MR) are information and communication technologies that have been increasingly recognised and implemented in healthcare in recent years. Games are among the most popular applications, owing to their potential to provide an engaging and immersive experience in a virtual environment. This study presents a systematic literature review that evaluates the state of the art of VR/AR/MR game applications in healthcare by collecting and analysing related journal and conference papers published from 2014 through the first half of 2020. After retrieving more than 3,000 papers from six databases, 88 articles, from both computer science and medicine, were selected and analysed in the review. The articles are classified and summarised based on their (1) publication information, (2) design, implementation, and evaluation, and (3) application. The presented review is beneficial for both researchers and developers interested in exploring current research and future trends in VR/AR/MR in healthcare.
In the wake of the restrictions imposed on social interactions due to the COVID-19 pandemic, traditional classroom education was replaced by distance education in many universities. Under the changed circumstances, students are required to learn more independently. The challenge for teachers has been to reliably gauge students' learning efficiency and engagement during online lectures. This paper proposes an optimized lightweight convolutional neural network (CNN) model for engagement recognition from facial expressions in a distance-learning setting. The ShuffleNet v2 architecture was selected, as this model adapts easily to mobile platforms and delivers outstanding performance compared with other lightweight models. The proposed model was trained, tested, evaluated, and compared with other CNN models. The results of our experiment showed that an optimized model based on the ShuffleNet v2 architecture, with a change of activation function and the introduction of an attention mechanism, provides the best performance for engagement recognition. Further, our proposed model outperforms many existing works in engagement recognition on the same database. Finally, this model is suitable for student engagement recognition for distance learning on mobile platforms.
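The attention mechanism and modified activation are described above only at a high level; one common lightweight channel-attention design for CNNs is squeeze-and-excitation (SE). The following NumPy sketch illustrates that general idea only — the function name, weight shapes, and reduction ratio are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def se_channel_attention(feature_map, w1, w2):
    """Squeeze-and-excitation channel attention on a (C, H, W) feature map.

    w1: (C//r, C) reduction weights, w2: (C, C//r) expansion weights,
    where r is the reduction ratio. Illustrative sketch only; in a real
    model w1 and w2 are learned during training.
    """
    # Squeeze: global average pooling over spatial dims -> (C,)
    z = feature_map.mean(axis=(1, 2))
    # Excitation: bottleneck MLP with ReLU, then sigmoid gating per channel
    s = np.maximum(w1 @ z, 0.0)              # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))   # sigmoid, shape (C,)
    # Reweight each channel of the feature map by its gate value
    return feature_map * gate[:, None, None]
```

In a trained network, such a block would sit after a convolutional stage so that informative channels are amplified before classification; the cited work's exact placement and design may differ.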
eHealth is an emerging area that has grown with advances in information and communication technology (ICT). Because the variety of eHealth solutions developed by different IT firms follows no unified standard, interoperability issues have arisen. In this paper, a case study of healthcare organisations in Blekinge County was conducted to understand the contexts of eHealth interoperability issues. A peer-to-peer (P2P) model based on the JXTA platform was then implemented to solve the identified eHealth interoperability problems. According to the test results of the prototype, the proposed syntactic-level interoperability among healthcare organisations was achieved.
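Syntactic-level interoperability means that peers can parse one another's messages because they agree on a common format, even when their internal representations differ. As a minimal illustration — with invented field names and date formats, not the actual prototype, which exchanges messages over the JXTA platform — two adapters can map heterogeneous records onto one shared schema:

```python
# Hypothetical example of syntactic interoperability: two healthcare
# systems export patient records differently; adapters normalise both
# to a single shared schema so peers can exchange records.

def from_system_a(record):
    """System A (hypothetical) uses 'pid' and 'DD/MM/YYYY' birth dates."""
    day, month, year = record["birth"].split("/")
    return {"patient_id": record["pid"],
            "name": record["name"],
            "birth_date": f"{year}-{month}-{day}"}  # normalise to ISO 8601

def from_system_b(record):
    """System B (hypothetical) uses 'patientNumber' and ISO dates."""
    return {"patient_id": record["patientNumber"],
            "name": record["fullName"],
            "birth_date": record["dob"]}
```

Once both systems emit the shared schema, a receiving peer needs only one parser — which is the essence of the syntactic-level interoperability the prototype targets.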
Due to the advances in head-mounted displays (HMDs), hardware and software technologies, and mobile connectivity, virtual reality (VR) applications such as viewing 360° videos on HMDs have seen an increased interest in a wide range of consumer and vertical markets. Quality assessment of digital media systems and services related to immersive visual stimuli has been one of the challenging problems of multimedia signal processing. Specifically, subjective quality assessment of 360° videos presented on HMDs is needed to obtain a ground truth on the visual quality as perceived by humans. Standardized test methodologies to assess the subjective quality of 360° videos on HMDs are currently not as developed as for conventional videos and are subject to further study. In addition, subjective tests related to quality assessment of 360° videos are commonly conducted with participants seated on a chair but neglect other options of consumption such as standing viewing. In this paper, we compare the effect that standing and seated viewing of 360° videos on an HMD has on subjective quality assessment. A pilot study was conducted to obtain psychophysical and psychophysiological data that covers explicit and implicit responses of the participants to the shown 360° video stimuli with different quality levels. The statistical analysis of the data gathered in the pilot study is reported in terms of average rating times, mean opinion scores, standard deviation of opinion scores, head movements, pupil diameter, galvanic skin response (GSR), and simulator sickness scores. The results indicate that the average rating times consumed for 360° video quality assessment are similar for standing and seated viewing. Further, the participants showed higher resolving power among different 360° video quality levels and were more confident about the given opinion scores for seated viewing.
On the other hand, a larger scene exploration of 360° videos was observed for standing viewing, which appears to distract from the quality assessment task. A slightly higher pupil dilation was recorded for standing viewing, which suggests a slightly more immersed experience compared to seated viewing. GSR data indicate a lower degree of emotional arousal in seated viewing, which seems to allow the participants to better conduct the quality assessment task. Similarly, simulator sickness symptoms are kept significantly lower when seated. The pilot study also contributes to a holistic view of subjective quality assessment and provides indicative ground truth that can guide the design of large-scale subjective tests.
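The mean opinion score (MOS) and standard deviation of opinion scores reported above are computed per stimulus from the participants' individual ratings. A minimal sketch, assuming ratings on a discrete absolute category rating scale (e.g. 1–5); the function name is illustrative:

```python
import statistics

def mos_and_sos(ratings):
    """Mean opinion score (MOS) and standard deviation of opinion scores
    for one stimulus, given per-participant ratings (e.g. on a 1-5 scale).
    """
    mos = statistics.mean(ratings)
    sos = statistics.stdev(ratings)  # sample standard deviation
    return mos, sos
```

For example, ratings of [4, 5, 4, 3, 4] yield a MOS of 4.0 with a standard deviation of about 0.71; a larger standard deviation for a stimulus signals lower agreement among raters.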
Background Many studies of Internet gaming disorder (IGD) have noted an interesting pattern: males are consistently observed to be in the majority. However, little research has explored differences in the neural mechanisms of decision-making between males and females with IGD. Examining reward/loss processing across genders in IGD could therefore help clarify the underlying neural mechanisms of the disorder. Methods Functional magnetic resonance imaging (fMRI) data were collected from 111 subjects (IGD: 29 males, 25 females; recreational internet game users (RGU): 36 males, 21 females) while they performed a card-guessing task. Brain responses to the win and loss conditions were collected and compared across groups. Results For win conditions, the IGD group showed hypoactivity in the lingual gyrus relative to the RGU group, and male players showed hyperactivity in the left caudate nucleus, bilateral cingulate gyrus, right middle frontal gyrus (MFG), right precuneus, and inferior parietal lobule relative to females. Significant sex-by-group interactions showed higher activity in the thalamus and parahippocampal gyrus, and lower activity in the inferior frontal gyrus (IFG), in males with IGD relative to females with IGD. For loss conditions, the IGD group showed hypoactivity in the left lingual gyrus, parahippocampal gyrus, and right anterior cingulate cortex (ACC) compared with the RGU group, and male players showed a hyperactive left caudate nucleus and a hypoactive right middle occipital gyrus relative to females. Significant sex-by-group interactions showed that, compared with females with IGD, males with IGD had decreased activity in the IFG and lingual gyrus. Conclusions First, there appeared to be no difference in reward processing between the IGD and RGU groups, but the IGD group showed less sensitivity to loss.
Second, male players were more sensitive to rewards and less sensitive to losses. Finally, males and females showed opposite activation patterns with respect to IGD severity and reward/loss processing. Males with IGD were more sensitive to reward and less sensitive to loss than females, which might explain the different rates of IGD between genders.
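The sex-by-group interactions reported above correspond to a 2×2 contrast over group (IGD vs. RGU) and sex (male vs. female). As a schematic sketch of the contrast itself — the actual analysis is a voxelwise fMRI model with multiple-comparison correction, not this simple arithmetic, and the function name is illustrative:

```python
def sex_by_group_interaction(cell_means):
    """2x2 interaction contrast on mean activations:
    (IGD male - IGD female) - (RGU male - RGU female).

    cell_means maps (group, sex) -> a mean activation estimate (e.g. a
    GLM beta averaged over a region). A nonzero value means the
    male-female difference itself differs between IGD and RGU groups.
    """
    return ((cell_means[("IGD", "male")] - cell_means[("IGD", "female")])
            - (cell_means[("RGU", "male")] - cell_means[("RGU", "female")]))
```

A significant interaction is what licenses conclusions such as "males with IGD showed decreased IFG activity relative to females with IGD" over and above any overall sex difference seen in the RGU group.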