2018
DOI: 10.3389/fneur.2018.00652

Validation of a Lower Back “Wearable”-Based Sit-to-Stand and Stand-to-Sit Algorithm for Patients With Parkinson's Disease and Older Adults in a Home-Like Environment

Abstract: Introduction: Impaired sit-to-stand and stand-to-sit movements (postural transitions, PTs) in patients with Parkinson's disease (PD) and older adults (OA) are associated with risk of falling and reduced quality of life. Inertial measurement units (IMUs, also called “wearables”) are powerful tools to monitor PT kinematics. The purpose of this study was to develop and validate an algorithm, based on a single IMU positioned at the lower back, for PT detection and description in the above-mentioned groups in a hom…
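The abstract is truncated above, but the general idea of detecting postural transitions from a single lower-back IMU can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example (sampling rate, filter cutoff, peak threshold, and the sign-to-direction rule are assumed values that depend on sensor mounting); it is not the authors' validated algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 100.0  # assumed sampling rate (Hz); hypothetical value


def lowpass(signal, cutoff_hz, fs=FS, order=4):
    """Zero-phase low-pass filter to suppress tremor and sensor noise."""
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, signal)


def detect_postural_transitions(gyro_pitch_rad_s, min_peak_rad_s=0.5):
    """Flag candidate sit-to-stand / stand-to-sit events from a lower-back IMU.

    gyro_pitch_rad_s: angular velocity about the medio-lateral axis
    (forward/backward trunk rotation). The threshold and the direction
    rule below are illustrative only.
    """
    smoothed = lowpass(gyro_pitch_rad_s, cutoff_hz=1.5)
    # A postural transition appears as a large flexion/extension burst,
    # i.e. a prominent peak in the magnitude of trunk angular velocity.
    peaks, _ = find_peaks(np.abs(smoothed), height=min_peak_rad_s,
                          distance=int(1.0 * FS))
    events = []
    for p in peaks:
        # Sign-to-direction mapping depends on the sensor's axis convention.
        direction = "sit-to-stand" if smoothed[p] > 0 else "stand-to-sit"
        events.append({"time_s": p / FS, "type": direction})
    return events
```

In practice, a validated pipeline would add orientation calibration, a trunk-tilt check to separate transitions from forward bending, and thresholds tuned on reference (e.g., video or motion-capture) data.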

Cited by 39 publications (51 citation statements)
References 64 publications
“…Multimodal describes data captured from two or more unique measurement methods. For example, a combination of accelerometer and gyroscope data can be used to detect falls and sit-to-stand transitions 35,36 . Digital tools relying on multimodal data should have evidence of verification available for each sensor, and evidence of analytical validation and clinical validation for the measure itself.…”
Section: Extending V3 Concepts To Multimodal and Composite Digital Me…
mentioning, confidence: 99%
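The multimodal combination mentioned here (accelerometer plus gyroscope) is often implemented as a complementary filter on trunk tilt. The sketch below shows that general idea only; the axis assignment, sampling rate, and alpha weight are assumptions, not values taken from references 35 and 36.

```python
import numpy as np


def complementary_pitch(acc_xyz, gyro_pitch_rad_s, fs=100.0, alpha=0.98):
    """Fuse accelerometer and gyroscope data into a trunk pitch estimate.

    acc_xyz          : (N, 3) accelerations in the sensor frame (m/s^2)
    gyro_pitch_rad_s : (N,) angular velocity about the medio-lateral axis
    alpha            : weight on the gyro path; value and axes are illustrative.
    """
    dt = 1.0 / fs
    # Accelerometer-only tilt: noisy during movement but drift-free.
    acc_pitch = np.arctan2(acc_xyz[:, 0],
                           np.sqrt(acc_xyz[:, 1] ** 2 + acc_xyz[:, 2] ** 2))
    pitch = np.zeros(len(gyro_pitch_rad_s))
    pitch[0] = acc_pitch[0]
    for i in range(1, len(pitch)):
        # Gyro integration tracks fast motion; the accelerometer corrects slow drift.
        pitch[i] = (alpha * (pitch[i - 1] + gyro_pitch_rad_s[i] * dt)
                    + (1 - alpha) * acc_pitch[i])
    return pitch
```

The fused pitch signal can then feed a transition or fall detector that neither sensor stream would support reliably on its own.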
“…The current generation of wearable sensors are limited by low-fidelity, low resolution, or uni-dimensional data analysis (e.g. velocity) based on gross assumptions of linear regression, which overfit to a simple movement pattern or participant cohort [7], however, researchers have reported success deriving kinematics from these devices for movement classification [40,52]. To improve on these methods, a number of research teams have sought to leverage computer vision and data science techniques, and while initial results appear promising, to date they lack validation to ground truth data, or relevance to specific sporting related tasks [8,49,53].…”
Section: Introduction
mentioning, confidence: 99%
“…Instead of quantifying the duration of body positions, it is also common to count the transitions between these positions. The transition between sitting and standing was frequently investigated [40, 41, 48, 53, 57, 59, 63–73], while only three studies detected the transition between lying and sitting [48, 57, 70]. Three of these studies further discriminated between transitions and bending forward [53, 65, 67], and two additional studies specifically detected sit-to-walk transitions since they aimed to compare the timed up and go test with transitions in daily life [74, 75].…”
Section: Results
mentioning, confidence: 99%
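One simple way to count transitions between classified body positions, as described in the statement above, is to collapse per-sample posture labels into bouts and count label changes. The sketch below is a generic illustration (label names and the debounce length are assumed), not the method of any particular cited study.

```python
from itertools import groupby


def count_transitions(posture_labels, min_bout_samples=100):
    """Count transitions between body positions in a classified label sequence.

    posture_labels   : per-sample labels such as "sit", "stand", "lie"
    min_bout_samples : ignore bouts shorter than this to suppress classifier
                       flicker (illustrative value, e.g. 1 s at 100 Hz).
    """
    # Collapse the sequence into (label, bout_length) runs.
    bouts = [(label, sum(1 for _ in run)) for label, run in groupby(posture_labels)]
    stable = [label for label, length in bouts if length >= min_bout_samples]
    counts = {}
    for prev, nxt in zip(stable, stable[1:]):
        if prev != nxt:
            counts[(prev, nxt)] = counts.get((prev, nxt), 0) + 1
    return counts  # e.g. {("sit", "stand"): 12, ("stand", "sit"): 11, ...}
```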
“…Three of these studies further discriminated between transitions and bending forward [53, 65, 67], and two additional studies specifically detected sit-to-walk transitions since they aimed to compare the timed up and go test with transitions in daily life [74, 75]. Standing up was further analyzed in terms of speed [40, 63, 64, 68, 71, 73–75], range of motion [40, 64, 71, 74, 75], and smoothness [74, 75]. Only one study detected transfers (i.e., moving from one surface to another without changing body position) [62].…”
Section: Results
mentioning, confidence: 99%
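Speed, range of motion, and smoothness of standing up can each be reduced to a scalar once a transition window has been segmented. The sketch below uses common generic definitions (peak angular velocity, pitch excursion, integrated squared jerk); the cited studies may define these outcomes differently, and the sampling rate is an assumed value.

```python
import numpy as np


def transition_metrics(pitch_rad, fs=100.0):
    """Summarize one segmented stand-up (or sit-down) window.

    pitch_rad: trunk pitch angle (rad) over the detected transition.
    These metric definitions are generic illustrative choices.
    """
    dt = 1.0 / fs
    duration_s = len(pitch_rad) * dt                  # slower transitions last longer
    range_of_motion = float(np.max(pitch_rad) - np.min(pitch_rad))
    velocity = np.gradient(pitch_rad, dt)             # angular velocity (rad/s)
    peak_velocity = float(np.max(np.abs(velocity)))   # "speed" of the transition
    jerk = np.gradient(np.gradient(velocity, dt), dt)
    # Simple smoothness proxy: integrated squared jerk (smaller = smoother).
    smoothness_proxy = float(np.trapz(jerk ** 2, dx=dt))
    return {"duration_s": duration_s,
            "range_of_motion_rad": range_of_motion,
            "peak_velocity_rad_s": peak_velocity,
            "integrated_squared_jerk": smoothness_proxy}
```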