The aim of this study is to develop the peg transfer training module and assess its face, content, and construct validity across box, virtual reality (VR), cognitive virtual reality (CVR), augmented reality (AR), and mixed reality (MR) trainers, and thereby compare the advantages and disadvantages of these simulators. The training system (VatsSim-XR) comprises customized haptic-enabled thoracoscopic instruments, a virtual reality headset, an endoscope kit with navigation, and the corresponding patient-specific training environment. A cohort of 32 trainees, comprising 24 novices and 8 experts, used the real and virtual simulators in the department of thoracic surgery of Yunnan First People's Hospital. Both subjective and objective evaluations were developed to explore the potential of visual and haptic enhancements in peg transfer education. Experiments and evaluations conducted with both expert and novice thoracic surgeons show that experts outperform novices overall; the AR trainer provides the most balanced training environment in terms of visuohaptic fidelity and accuracy; the box trainer and the MR trainer deliver the most realistic 3D perception and the most immersive surgical performance, respectively; and the CVR trainer shows a better clinical effect than the traditional VR trainer. Combined in a systematic approach and tuned to specific fidelity requirements, such medical simulation systems could provide a more immersive and effective training environment.
With the integration of photovoltaic (PV) power into electrical networks, grid management is becoming more complex because of the intermittent and fluctuating nature of solar energy. Solar irradiance forecasting is essential for planning and managing electricity generation and distribution in a smart-grid cyber-physical system (CPS). The performance of existing short-term forecasting methods is far from satisfactory owing to the lack of a reliable and fast time-frequency model for continuous-time solar irradiance data. To address this problem, this paper proposes a new method for hourly solar irradiance forecasting that couples the wavelet transform with Elman neural networks (WT-ENN). First, the solar irradiance series is decomposed into a set of constitutive sub-series using the wavelet transform. Second, the wavelet coefficients of each sub-series are predicted by an ENN with the best network structure and parameters. Third, wavelet reconstruction aggregates the outputs of the ensemble of ENNs to predict the next-hour solar irradiance. Finally, forecasting performance is evaluated on two large real-world solar irradiance datasets. Experimental results show that the new WT-ENN model outperforms a large number of alternative methods, achieving an average forecast skill of 0.7590 over the persistence model. Thus, the proposed approach can significantly improve forecasting accuracy and reliability.
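The decompose-predict-reconstruct pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a one-level Haar wavelet (the abstract does not specify the mother wavelet), and a naive last-value predictor stands in for the trained per-sub-series Elman networks. The forecast-skill formula s = 1 − RMSE_model / RMSE_persistence is the standard definition used when skill is reported "over the persistence model".

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_decompose(x):
    """One-level Haar DWT: split a series (even length) into
    approximation (a) and detail (d) coefficient sub-series."""
    a = [(x[i] + x[i + 1]) / SQRT2 for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / SQRT2 for i in range(0, len(x), 2)]
    return a, d

def haar_reconstruct(a, d):
    """Inverse one-level Haar DWT: merge coefficient sub-series
    back into the time-domain series."""
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / SQRT2)
        out.append((ai - di) / SQRT2)
    return out

def predict_next(coeffs):
    """Placeholder for the per-sub-series ENN predictor:
    simply repeat the last observed coefficient."""
    return coeffs[-1]

def wt_forecast(series):
    """Forecast the next value: predict each wavelet sub-series one
    step ahead, then reconstruct the predicted coefficient pair."""
    a, d = haar_decompose(series)
    a_next, d_next = predict_next(a), predict_next(d)
    pair = haar_reconstruct([a_next], [d_next])
    return pair[1]  # second reconstructed sample = next-hour value

def forecast_skill(rmse_model, rmse_persistence):
    """Forecast skill s = 1 - RMSE_model / RMSE_persistence."""
    return 1.0 - rmse_model / rmse_persistence
```

In the actual method, `predict_next` would be replaced by an Elman network trained per sub-series, and the decomposition would typically use several levels of a smoother wavelet.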
Realistic tool-tissue interaction modeling has been recognized as an essential requirement in virtual surgery training. A virtual basic surgical training framework integrated with real-time force rendering is one of the most immersive implementations in medical education. Yet there has long been an argument that, compared with the original intraoperative data, the data in virtual surgical training are rendered at lower fidelity. In this paper, a dynamic biomechanics experimental framework is designed to achieve a highly immersive haptic sensation during biopsy under human respiratory motion; to our knowledge, this is the first time the idea of periodic extension has been introduced into dynamic percutaneous force modeling. A clinical evaluation conducted at Yunnan First People's Hospital not only demonstrated a higher fitting degree (AVG: 99.36%) with the intraoperative data than previous algorithms (AVG: 87.83%, 72.07%, and 66.70%) but also showed a universal fitting range across multilayer tissue. Twenty-seven urologists, comprising 18 novices and 9 professors, were invited to a VR-based training evaluation built on the proposed haptic rendering solution. Subjective and objective results demonstrated higher performance than the existing benchmark training simulator. Combined in a systematic approach and tuned to specific fidelity requirements, haptically enabled medical simulation systems could provide a more immersive and effective training environment.
When the input colors of the left and right eyes differ, binocular rivalry may occur. According to Hering's theory, opponent colors should have the strongest tendency toward rivalry. However, binocular color fusion still occurs provided that each eye's opponent chromatic responses do not exceed a specific chromatic fusion limit (CFL). This paper measures the binocular chromatic fusion limit for opposite colors within a conventional 3D display color gamut. We conducted a psychophysical experiment to quantitatively measure the CFL along four opposite-color directions in the CIELAB color space. Because color inconsistency between the eyes may affect binocular color fusion, the experiment was divided into two sessions by swapping the stimulus colors of the left and right eyes. Five subjects each completed 320 trials. From the results, we used ellipses to quantify the chromatic fusion limits for opposing colors: the average semi-major axis of the ellipses is 27.55 ΔE*ab, and the average semi-minor axis is 16.98 ΔE*ab. We observed that the CFL varies with the opposite-color direction: the CFL on the RedBlue-GreenYellow direction is greater than that on the Red-Green direction, which in turn is greater than that on the Yellow-Blue direction, and the CFL on the RedYellow-GreenBlue direction is smallest. Furthermore, the results suggest that the chromatic fusion limit is independent of the distribution of cells, and that the fusion ellipse boundaries show no significant change after swapping the left- and right-eye colors.
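The quantities above can be made concrete with a short sketch. The ΔE*ab color difference is the standard CIE76 Euclidean distance in CIELAB; the ellipse-membership helper is an illustrative assumption, modeling the fusion boundary as an axis-aligned ellipse in the (Δa*, Δb*) plane with the average semi-axes reported above (the paper's actual ellipses are oriented along its four measured color directions).

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference ΔE*ab between two CIELAB
    colors given as (L*, a*, b*) tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def within_fusion_ellipse(da, db, semi_major=27.55, semi_minor=16.98):
    """Illustrative check: does a chromatic offset (Δa*, Δb*) fall
    inside an axis-aligned ellipse with the given semi-axes?
    Defaults use the average CFL semi-axes reported in the abstract."""
    return (da / semi_major) ** 2 + (db / semi_minor) ** 2 <= 1.0
```

For example, two colors at the same lightness separated by (Δa*, Δb*) = (3, 4) differ by ΔE*ab = 5, comfortably inside the average fusion ellipse, while an offset of 30 along the major axis falls outside it.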