BACKGROUND The assessment of clinical procedural skills has traditionally focused on technical elements alone. However, in real practice, clinicians are expected to integrate technical skills with communication and other professional skills. We describe an integrated procedural performance instrument (IPPI), in which clinicians are assessed on 12 clinical procedures in a simulated clinical setting that combines simulated patients (SPs) with inanimate models or items of medical equipment. Candidates are observed remotely by assessors, whose data are fed back to the clinician within 24 hours of the assessment. This paper describes the feasibility of IPPI.
RESULTS A full-scale IPPI and 2 pilot studies with trainee and qualified health care professionals have yielded an extensive data set, including 585 scenario evaluations from candidates, 60 from clinical assessors and 31 from SPs. Interview and questionnaire data showed that, for the majority of candidates, IPPI provided a powerful and valuable learning experience. Realism was rated highly. Remote, real-time assessment worked effectively, although for some procedures limited camera resolution affected observation of fine detail.
DISCUSSION IPPI offers an innovative approach to assessing clinical procedural skills. Although resource-intensive, it has the potential to provide insight into an individual's performance over a spectrum of clinical scenarios at no risk to patient safety. Additional benefits of IPPI include real-time assessment by experts (allowing remote rating by external examiners) as well as provision of feedback from simulated patients.
The art of picking up signs that a child may be suffering from abuse at home is one of those skills that cannot easily be taught, given its dependence on a range of non-cognitive abilities. It is also difficult to study, given the number of factors that may interfere with this skill in a real-life, professional setting. An immersive virtual reality environment provides a way around these difficulties. In this study, we recruited 64 general practitioners (GPs) with different levels of experience. Would this level of experience have any impact on GPs' ability to pick up child-safeguarding concerns? Would more experienced GPs find it easier to pick up subtle (rather than obvious) signs of child-safeguarding concerns? Our main measurement was the quality of the note left by the GP at the end of the virtual consultation: a panel of 10 raters (all experienced in safeguarding) assessed each note according to the extent to which the GP had identified, and taken the necessary steps in relation to, the child-safeguarding concerns. While the level of professional experience was not shown to make any difference to a GP's ability to pick up those concerns, the parent's level of aggressive behavior toward the child did. We also manipulated the level of cognitive load (reflected in a complex presentation of the patient's medical condition): while cognitive load did have some impact upon GPs in the "obvious cue" condition (parent behaving particularly aggressively), this effect fell short of significance. Furthermore, our results also suggest that GPs who are less stressed, less neurotic, more agreeable and more extroverted tend to be better at raising potential child abuse issues in their notes.
These results not only point to the considerable potential of virtual reality as a training tool, they also highlight fruitful avenues for further research, as well as potential strategies to support GPs in dealing with highly sensitive, emotionally charged situations.
This paper presents an application of the CASSM (Concept-based Analysis of Surface and Structural Misfits) framework to interactive machine learning for a bodily interaction domain. We developed software to enable end users to design full-body interaction games involving interaction with a virtual character. The software used a machine learning algorithm to classify postures based on examples provided by users. A longitudinal study showed that training the algorithm was straightforward, but that debugging errors was very challenging. A CASSM analysis showed that there were fundamental mismatches between the users' concepts and the working of the learning system. This resulted in a new design which aimed to better align both the learning algorithm and user interface with users' concepts. This work provides an example of how HCI methods can be applied to machine learning in order to improve its usability and provide new insights into its use.
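The abstract does not specify which learning algorithm classified the postures, so the following is only a minimal illustrative sketch of the general idea it describes: a user supplies a handful of labelled posture examples, and new postures are classified against them. A nearest-neighbour rule stands in for the paper's (unspecified) classifier; the posture vectors and labels are invented for illustration.

```python
# Hypothetical sketch of example-based posture classification.
# The nearest-neighbour rule and the 2-D "posture vectors" are
# assumptions for illustration, not the paper's actual algorithm.
import math

def train(examples):
    # examples: list of (posture_vector, label) pairs given by the user.
    # Nearest-neighbour "training" just stores the examples.
    return list(examples)

def classify(model, posture):
    # Return the label of the closest stored example (Euclidean distance).
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda ex: dist(ex[0], posture))[1]

# Invented example postures: each vector could encode e.g. joint angles.
examples = [((0.0, 0.1), "arms_down"), ((1.0, 0.9), "arms_up")]
model = train(examples)
print(classify(model, (0.9, 1.0)))  # nearest stored example is "arms_up"
```

A design like this also hints at the debugging problem the study found: when the classifier misfires, the user sees only the wrong label, not which stored example caused it.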
This paper presents a system that allows end users to design full-body interactions with a 3D animated virtual character through a process we call Interactive Performance Capture. This process is embodied in the sense that users design directly by moving and interacting, using an interactive machine learning method. Two people improvise an interaction based only on their movements: one plays the part of the virtual character, the other plays a real person. Their movements are recorded and labelled with metadata that identifies certain actions and responses. This labelled data is then used to train a Gaussian Mixture Model that is able to recognize new actions and generate suitable responses from the virtual character. A small study showed that users do indeed design in a very embodied way, using movement directly as a means of thinking through and designing interactions.
This paper presents the design and implementation of a software platform for creating interactive visualisations that respond to the free-form movements of a non-professional dancer. The visualisations can be trained to respond to the idiosyncratic movements of an individual dancer. This adaptive process is controlled by Interactive Machine Learning (IML). Our approach is novel because the behaviour of the interactive visualisations is trained by a dancer dancing, rather than by a computer scientist explicitly programming rules. In this way IML enables an 'embodied' form of design, where a dancer can design an interactive system by moving, rather than by analysing movement. This embodied design process taps into and supports our natural and embodied human understanding of movement.
We hope the process of designing an interactive experience for free-form dance will help us to understand more about how to create embodied interfaces and allow us to build a general framework for embodied interaction. We would also like to create a compelling, embodied and enjoyable experience with more satisfying interactions than previous dance computer games, which use pre-scripted routines where a player must repeat a sequence of moves.
The system was developed using a participatory methodology, with a software developer and an interaction designer working in partnership with users to test and refine two prototypes of the system. A third prototype has been built but not yet tested.
Cowie, Dorothy (2021) 'My virtual self: the role of movement in children's sense of embodiment', IEEE Transactions on Visualization and Computer Graphics.