This version is available at https://strathprints.strath.ac.uk/54353/

Abstract
Biofeedback from physical rehabilitation exercises has been shown to lead to faster recovery, better outcomes, and increased patient motivation. It also allows the rehabilitation process carried out at the clinic to be complemented with exercises performed at home. However, currently existing approaches rely mostly on audio and visual reinforcement cues, usually presented to the user on a computer screen or a mobile phone interface. Some users, such as elderly people, can have difficulty using and understanding these interfaces, leading to non-compliance with the rehabilitation exercises.
To overcome this barrier, recent biosignal technologies can be used to enhance the efficacy of the biofeedback while decreasing the complexity of the user interface. In this paper we propose and validate a context-aware framework for the use of animatronic biofeedback, as a way of potentially increasing the compliance of elderly users with physical rehabilitation exercises performed at home. In the scope of our work, animatronic biofeedback entails the use of pre-programmed actions on a robot that are triggered in response to certain changes detected in the users' biomechanical or electrophysiological signals. We use electromyographic and accelerometer signals, collected in real time, to monitor the performance of the user while executing the exercises, and a mobile robot to provide animatronic reinforcement cues associated with their correct or incorrect execution. A context-aware application running on a smartphone aggregates the sensor data and controls the animatronic feedback. The acceptability of the animatronic biofeedback has been tested with a group of elderly volunteers. The results suggest that the participants found the animatronic feedback engaging and of added value.
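The trigger logic described above can be sketched as a simple mapping from sensor features to cue names. This is a minimal illustration only: the feature choices (EMG root-mean-square, accelerometer tilt), the threshold values, and the cue names are all assumptions for the sketch, not details taken from the paper.

```python
import math

# Assumed thresholds for the sketch; the paper does not specify these values.
EMG_RMS_MIN = 0.15      # minimum muscle activation accepted as a valid effort
TILT_MAX_DEG = 20.0     # maximum trunk tilt tolerated during the exercise

def emg_rms(samples):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def tilt_deg(ax, ay, az):
    """Tilt from vertical estimated from a 3-axis accelerometer reading (in g)."""
    return math.degrees(math.acos(az / math.sqrt(ax * ax + ay * ay + az * az)))

def feedback_cue(emg_window, accel):
    """Map one window of sensor data to a hypothetical animatronic cue."""
    if emg_rms(emg_window) < EMG_RMS_MIN:
        return "encourage"          # too little effort: robot prompts the user
    if tilt_deg(*accel) > TILT_MAX_DEG:
        return "correct_posture"    # bad posture: robot signals a correction
    return "celebrate"              # correct execution: positive reinforcement

# Simulated window: strong activation, near-upright posture
print(feedback_cue([0.3, -0.25, 0.28, -0.31], (0.05, 0.02, 0.98)))  # celebrate
```

In a real deployment the smartphone application would compute these features continuously from the streamed sensor data and dispatch the selected cue to the robot's pre-programmed action set.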
Abstract. Standard microphones and ultrasonic devices are generally designed with a static, flat frequency response in order to address multiple acoustic applications. However, they may not be flexible or adaptable enough to meet some requirements. For instance, when operated in noisy environments such devices may be vulnerable to wideband background noise, which then requires further signal processing to remove, generally relying on digital processor units. In this work, we consider whether microphones and ultrasonic devices could be designed to be sensitive only at selected frequencies of interest, whilst also providing the flexibility to adapt to different signals of interest and to environmental demands. This research exploits the concept of the "transducer becoming part of the signal processing chain" by exploring feedback processes between mechanical and electrical mechanisms that together can enhance peripheral sound processing. This capability is present within a biological acoustic system, namely the ears of certain moths, which served as the model of inspiration for a smart acoustic sensor system that dynamically adapts its frequency response, with amplitude and time dependency, according to the input signal of interest.
Natural passive mechanical systems such as ear tympanic membranes may show active responses by incorporating feedback mechanisms that affect their mechanical structure. In this paper, the moth's auditory system is used as a biological model of inspiration, and a smart acoustic system that alters its natural resonance frequency is developed. Experimental results, obtained with a purpose-built real-time embedded system, show time- and amplitude-dependent dynamic frequency adaptation according to the intensity of the acoustic input signals.
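The amplitude- and time-dependent adaptation described in the two abstracts above can be illustrated with a toy discrete-time update rule: when the stimulus is loud, feedback pulls the sensor's resonance toward the input frequency; when it is weak, the resonance relaxes back to its passive rest value over time. The constants and the linear update rule here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative constants; not values from the paper.
F_REST = 40_000.0      # assumed passive (rest) resonance frequency, in Hz
ADAPT_RATE = 0.2       # fraction of the frequency error corrected per step
AMP_THRESHOLD = 0.1    # input amplitude required to drive adaptation
RELAX_RATE = 0.05      # relaxation toward rest when the stimulus is weak

def update_resonance(f_current, f_input, amplitude):
    """One discrete update of the sensor's resonance frequency (Hz)."""
    if amplitude >= AMP_THRESHOLD:
        # Strong stimulus: feedback pulls the resonance toward the input tone.
        return f_current + ADAPT_RATE * (f_input - f_current)
    # Weak stimulus: the resonance relaxes back toward its rest value.
    return f_current + RELAX_RATE * (F_REST - f_current)

f = F_REST
for _ in range(10):                      # sustained loud 45 kHz tone
    f = update_resonance(f, 45_000.0, 0.5)
print(round(f))                          # resonance has shifted close to 45 kHz
```

This captures both dependencies reported experimentally: the size of the shift grows with stimulus amplitude (here, gated by a threshold), and the shift accumulates and decays over time rather than instantaneously.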