2020
DOI: 10.1098/rsif.2019.0715
Estimation of absolute states of human skeletal muscle via standard B-mode ultrasound imaging and deep convolutional neural networks

Abstract: Objective: To test automated in vivo estimation of active and passive skeletal muscle states using ultrasonic imaging. Background: Current technology (electromyography, dynamometry, shear wave imaging) provides no general, non-invasive method for online estimation of skeletal intramuscular states. Ultrasound (US) allows non-invasive imaging of muscle, yet current computational approaches have never achieved simultaneous extraction nor generalisation of independently varying, active and passive states. We use d…

Cited by 23 publications (18 citation statements)
References 69 publications
“…well as to estimate/predict states generally from individual muscles, given that this information is well encoded in skeletal muscles' collagen structure and is observable using US imaging (Cunningham and Loram, 2020). Recent research studies have also applied US imaging + deep (machine) learning to estimate skeletal muscles' activation levels (Cunningham et al, 2017b; Cunningham and Loram, 2020; Feigin et al, 2020), fascicle length (Rosa et al, 2021), fascicle orientation (Cunningham et al, 2017a), and muscle segmentation (Carneiro and Nascimento, 2013; Zhou et al, 2020).…”
Section: Scientific and Clinical Significance
confidence: 99%
“…Therefore, the motivation of this work is to investigate continuous volitional effort prediction for lower limb joint functionalities using high-dimensional features from US imaging and a deep learning method. To the best of our knowledge, only a few recent studies have investigated deep learning approaches for continuous ankle joint kinematics, kinetics, and muscle state estimation (Cunningham et al, 2017b; Cunningham and Loram, 2020). However, they focused only on active and passive ankle joint movement tasks in a standing posture, and no functional dynamic locomotion tasks were discussed.…”
Section: Introduction
confidence: 99%
“…Another recent study into the utility of neural networks used simple two-dimensional ultrasound to estimate muscle states during passive and active contractions (Cunningham and Loram 2020). The authors trained a convolutional neural network on ultrasound images of the muscle, with the associated joint angle, moment, and EMG as targets.…”
Section: About Here>
confidence: 99%
“…EMG is meant to capture involuntary muscle activations, but this technique suffers from low signal reproducibility due to issues with electrode placement and motion artifacts (Misgeld et al., 2016; Sloot et al., 2017; Yu et al., 2020; Wang et al., 2017). Ultrasound has been used to inspect muscle fiber lengths and cross-sectional area, which relate to muscle strength (Moreau et al., 2009; Cunningham and Loram, 2020). Yet this method does not address the motion-dependent aspects of spasticity.…”
Section: Introduction
confidence: 99%