In chronic pain physical rehabilitation, physiotherapists adapt exercise sessions according to the movement behavior of patients. As rehabilitation moves beyond clinical sessions, technology is needed to similarly assess movement behaviors and provide such personalized support. In this paper, as a first step, we investigate automatic detection of protective behavior (movement behavior due to pain or pain-related fear) based on wearable motion capture and electromyography sensor data. We investigate two recurrent neural networks (RNNs), referred to as stacked-LSTM and dual-stream LSTM, which we compare with related deep learning (DL) architectures. We further explore data augmentation techniques and analyze the impact of segmentation window length on detection performance. The leading mean F1 score of 0.815, achieved by the stacked-LSTM, provides important grounding for the development of wearable technology to support chronic pain physical rehabilitation during daily activities.
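The segmentation step mentioned above can be illustrated with a minimal sketch: continuous multichannel sensor recordings are split into fixed-length, overlapping windows before being fed to a recurrent network. The window length (180 frames, i.e. 3 s at an assumed 60 Hz), hop size, and channel count below are hypothetical choices for illustration, not the paper's settings.

```python
import numpy as np

def segment(signal, window_len, hop):
    """Split a (time, channels) array into overlapping windows.

    Returns an array of shape (n_windows, window_len, channels);
    any trailing frames that do not fill a full window are dropped.
    """
    n_frames = signal.shape[0]
    starts = range(0, n_frames - window_len + 1, hop)
    return np.stack([signal[s:s + window_len] for s in starts])

# 600 frames of 26 hypothetical channels (e.g. joint angles + EMG envelopes)
x = np.random.randn(600, 26)
windows = segment(x, window_len=180, hop=90)  # 50% overlap
print(windows.shape)  # (5, 180, 26)
```

Varying `window_len` while holding the rest of the pipeline fixed is one way to study the sensitivity of detection performance to segmentation, as the abstract describes.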
People with chronic musculoskeletal pain would benefit from technology that provides run-time personalized feedback and helps adjust their physical exercise plan. However, increased pain during physical exercise, or anxiety about an anticipated increase in pain, may lead to setbacks and intensified sensitivity to pain. Our study investigates the possibility of detecting pain levels from the quality of body movement during two functional physical exercises. By analyzing recordings of kinematics and muscle activity, our feature optimization algorithms and machine learning techniques can automatically discriminate among people with low-level pain, people with high-level pain, and control participants while exercising. The best results were obtained with feature-set optimization and Support Vector Machines: 94% for the full trunk flexion movement and 80% for the sit-to-stand movement. As depression can affect the pain experience, we also included participants' scores on a standard depression questionnaire; this improved discrimination between the control participants and the people with pain when Random Forests were used.
Although clinical best practice suggests that affect awareness could enable more effective technological support for physical rehabilitation through personalisation to psychological needs, designers need to consider which affective states matter, and how they should be tracked and addressed. In this article, we lay the groundwork by analysing how the major affective factors in chronic pain (pain, fear/anxiety, and low/depressed mood) interfere with everyday physical functioning. Further, based on a discussion of the modality that should be used to track these states so that technology can address them, we investigated the possibility of using movement behaviour to automatically detect them. Using two body movement datasets of people with chronic pain, we show that movement behaviour enables very good discrimination between two levels of emotional distress (F1 = 0.86) and three pain levels (F1 = 0.9). Performance remained high (F1 = 0.78 for two pain levels) with a reduced set of movement sensors. Finally, in an overall discussion, we suggest how technology-provided encouragement and awareness can be personalised, given the capability to automatically monitor the relevant states, to address the barriers that these states pose. In addition, we highlight the movement behaviour features to track in order to give technology the information necessary for such personalisation.
Fig. 1. Overview of the Body Attention Network. Each body part is described by its joint angle plus energy. Data collected from the feet were noisy and hence not used in this work.

Abstract: For people with chronic pain, the assessment of protective behavior during physical functioning is essential to understand their subjective pain-related experiences (e.g., fear and anxiety toward pain and injury) and how they deal with such experiences (avoidance or reliance on specific body joints), with the ultimate goal of guiding intervention. Advances in deep learning (DL) can enable the development of such interventions. Using the EmoPain MoCap dataset, we investigate how attention-based DL architectures can improve the detection of protective behavior by capturing the most informative temporal and body-configurational cues characterizing specific movements and the strategies used to perform them. We propose an end-to-end deep learning architecture named BodyAttentionNet (BANet). BANet is designed to learn the temporal ranges and body parts that are most informative for detecting protective behavior. The approach addresses the variety of ways people (including healthy people) execute a movement, independently of the type of movement analyzed. Through extensive comparison experiments with other state-of-the-art machine learning techniques used with motion capture data, we show statistically significant improvements achieved by using these attention mechanisms. In addition, the BANet architecture requires far fewer parameters than the state of the art for comparable, if not higher, performance.
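The core idea of temporal attention can be sketched in a few lines: each frame of a movement sequence receives a scalar relevance score, a softmax over time turns the scores into weights summing to one, and the sequence is summarized as the weighted sum of frames. This is a generic, hypothetical sketch with a random scoring vector standing in for learned parameters, not BANet's actual architecture.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def temporal_attention_pool(frames, w, b=0.0):
    """Attention-weighted pooling over time.

    frames: (T, D) per-frame features; w: (D,) scoring vector
    (learned in a real model). Returns the attention weights over
    time steps and the pooled (D,) summary of the sequence.
    """
    scores = frames @ w + b      # one relevance score per frame, shape (T,)
    alpha = softmax(scores)      # non-negative weights summing to 1
    return alpha, alpha @ frames

rng = np.random.default_rng(0)
frames = rng.standard_normal((100, 8))   # 100 frames, 8 features each
w = rng.standard_normal(8)               # stand-in for learned parameters
alpha, pooled = temporal_attention_pool(frames, w)
```

Inspecting `alpha` is what makes such models interpretable: high-weight frames are the portions of the movement the model deems most informative.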
Physical activity is essential in chronic pain rehabilitation. However, anxiety due to pain, or to a perceived exacerbation of pain, causes people to guard against beneficial exercise. Interactive rehabilitation technology sensitive to such behaviour could provide feedback to overcome these psychological barriers. To this end, we developed a Support Vector Machine framework with feature-level fusion of body motion and muscle activity descriptors to discriminate three levels of pain (none, low, and high). All subjects performed a forward-reaching exercise that is typically feared among people with chronic back pain. The pain levels were derived from control subjects (no pain) and thresholded self-reported levels from people with chronic pain. Salient features were identified using a backward feature-selection process. Using feature sets from each modality separately led to pain classification F1 scores of 0.63 and 0.69 for movement and muscle activity respectively; using the combined bimodal feature set increased this to F1 = 0.8.
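Feature-level fusion, as described above, simply means concatenating the per-modality feature vectors before training a single classifier. The sketch below illustrates this with synthetic stand-in data (the feature counts, class separations, and RBF kernel are assumptions for illustration, not the study's actual features or settings).

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 120
y = rng.integers(0, 3, n)  # 0 = no pain, 1 = low, 2 = high (synthetic labels)

# Synthetic per-modality descriptors, with class-dependent shifts so the
# toy problem is learnable; real descriptors would come from sensors.
motion = rng.standard_normal((n, 10)) + y[:, None] * 0.8
emg = rng.standard_normal((n, 6)) + y[:, None] * 0.5

fused = np.hstack([motion, emg])  # feature-level (early) fusion
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(clf, fused, y, cv=5).mean()
```

The same pipeline could be run on `motion` or `emg` alone to reproduce the unimodal-versus-bimodal comparison the abstract reports.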
Abstract: Clinicians tailor intervention in chronic pain rehabilitation to movement-related self-efficacy (MRSE). This motivates us to investigate automatic MRSE estimation in this context, toward the development of technology able to provide appropriate support in the absence of a clinician. We first explored clinical observer estimation, which showed that body movement behaviours, rather than facial expressions or engagement behaviours, were more pertinent to MRSE estimation during instances of physical activity. Based on these findings, we built a system that estimates MRSE from bodily expressions and muscle activity captured using wearable sensors. Our results (F1 scores of 0.95 and 0.78 for two physical exercise types) provide evidence of the feasibility of automatic MRSE estimation to support chronic pain physical rehabilitation. We further explored automatic MRSE estimation with a reduced set of low-cost sensors, to investigate the possibility of embedding such capabilities in ubiquitous wearable devices that support functional activity. Our evaluation on both exercise and functional activity resulted in an F1 score of 0.79. This result suggests the possibility of (and calls for more studies on) MRSE estimation during everyday functioning in ubiquitous settings. We conclude with a discussion of the implications of our findings for relevant areas.
Touch is a primary nonverbal communication channel used to convey emotions and other social messages. A variety of social touch gestures exist, including hugging, rubbing, and punching. Despite its importance, this channel remains little explored in the affective computing field, where much more focus has been placed on the visual and aural channels. In this paper, we investigate the possibility of automatically discriminating between different types of social touch. We propose five distinct feature sets for describing touch behaviours captured by a grid of pressure sensors, and combine them using Random Forest and Boosting methods to categorize the touch gesture type. The proposed methods were evaluated on both the HAART (7 gesture types over different surfaces) and the CoST (14 gesture types over the same surface) datasets made available through the Social Touch Gesture Challenge 2015. Performance well above chance level was achieved, with accuracies of 67% on the HAART and 59% on the CoST testing datasets.
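To make the idea of describing pressure-grid recordings with feature sets concrete, here is a hedged sketch of one plausible family of summary features (mean and peak pressure, contact area, temporal variability). These particular features, the threshold, and the 8x8 grid size are illustrative assumptions, not the five feature sets proposed in the paper.

```python
import numpy as np

def touch_features(frames, threshold=0.1):
    """Summary features from a touch recording.

    frames: (T, H, W) array of pressure-grid frames. Returns a small
    fixed-length feature vector suitable as classifier input.
    """
    total = frames.sum(axis=(1, 2))                    # total pressure per frame
    contact = (frames > threshold).mean(axis=(1, 2))   # fraction of taxels pressed
    return np.array([
        frames.mean(),    # mean pressure over the whole recording
        frames.max(),     # peak pressure
        contact.mean(),   # mean contact area (0..1)
        total.std(),      # temporal variability of total pressure
    ])

rng = np.random.default_rng(0)
rec = rng.random((50, 8, 8))  # hypothetical 50-frame recording on an 8x8 grid
feats = touch_features(rec)
```

Feature vectors like this one, computed per recording, are what ensemble methods such as Random Forests then consume to categorize the gesture type.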