2017
DOI: 10.1109/jbhi.2016.2551459

Toward Unobtrusive Patient Handling Activity Recognition for Injury Reduction Among At-Risk Caregivers

Abstract: Nurses regularly perform patient handling activities. Performing these activities in awkward postures exposes healthcare providers to a high risk of overexertion injury. Recognizing patient handling activities is the first step toward reducing injury risk for caregivers. Current practice in workplace activity recognition relies on a human observational approach, which is neither accurate nor projectable to a large population. In this paper, we aim to address these challenges. Our solution comprises a smart weara…
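The abstract stops short of describing the system itself, but the general recipe behind wearable-sensor activity recognition is well established: segment the sensor stream into sliding windows, compute features per window, and train a classifier. The sketch below illustrates that generic pipeline on synthetic accelerometer data; it is not the pipeline from this paper, and every name, window size, and feature choice in it is a hypothetical assumption.

```python
# Minimal sketch: window-based activity classification from a wearable
# accelerometer stream. All names and data are hypothetical; this is not
# the system described in the paper, only an illustration of the common
# approach (sliding windows -> hand-crafted features -> classifier).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    """Simple per-axis statistics for one window of shape (samples, 3)."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

def windows(signal, size=128, step=64):
    """Yield overlapping windows from a (samples, 3) accelerometer array."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

# Synthetic stand-in for labeled sensor data (two pretend activities).
rng = np.random.default_rng(0)
X, y = [], []
for label, scale in [(0, 0.5), (1, 2.0)]:
    stream = rng.normal(0, scale, size=(10_000, 3))   # fake 3-axis signal
    for w in windows(stream):
        X.append(extract_features(w))
        y.append(label)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```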

Cited by 27 publications (13 citation statements) | References 44 publications
“…For step detection, instrumented insoles were validated using visual observation [25,31,34,35], other devices (the Runtastic pedometer application and other smartphone applications) [34,36], or a predefined number of steps [24,36,37] (see Table 3). To validate the instrumented insoles for posture and activity recognition, the smart insole data were compared with data collected from direct observation during data collection, from a video recording, or from other wearable devices (2D accelerometer (ADXL202), gyroscope (Murata ENC-03J), ActivPAL device, PPAC (plantar-pressure-based ambulatory classification), and FF (foot force sensor) + GPS) [18,26,31,32,33,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60] (see Table 2).…”
Section: Results (mentioning)
confidence: 99%
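As a loose illustration of the "predefined number of steps" style of validation mentioned above, the sketch below counts steps from a single synthetic plantar-pressure channel by thresholded peak detection and compares the count against a known reference. The signal, sampling rate, and thresholds are invented for the example and do not come from any of the cited validation protocols.

```python
# Illustrative sketch only: count steps from one plantar-pressure channel by
# thresholded peak detection, then compare against a known reference count
# (as one would when validating against a predefined number of steps).
import numpy as np
from scipy.signal import find_peaks

fs = 100                                 # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)             # 30 s of data
reference_steps = 30                     # e.g., participant asked to take 30 steps

# Fake pressure trace: one loading peak per second plus a little noise.
pressure = np.clip(np.sin(2 * np.pi * 1.0 * t), 0, None) + 0.05 * np.random.randn(t.size)

peaks, _ = find_peaks(pressure, height=0.5, distance=int(0.5 * fs))
detected_steps = peaks.size

error_pct = 100 * abs(detected_steps - reference_steps) / reference_steps
print(f"detected {detected_steps} steps vs {reference_steps} reference "
      f"({error_pct:.1f}% error)")
```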
“…18(a)). The external commercial ICs of References [8]-[11] can also be used to separate the input signal; however, the complexity and area of the modules will be higher. V_BC is used to collect the variations of the CCA controlled by the MUX through digital codes (D4-D6).…”
Section: CA and Self-Separated Input Signal (mentioning)
confidence: 99%
“…27(b). Table 17 shows the relative errors according to the following fitting curve equation: Table 18 shows the differences between Table 14 and Equation (8). The value shown on the display interface is converted by an equation.…”
Section: B. CCPP Sensor (mentioning)
confidence: 99%
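The fitting-curve equation referred to in the quote did not survive extraction, so only the notion of a relative error can be illustrated. A generic sketch, assuming hypothetical measured and fitted values (not the cited tables):

```python
# Generic illustration of a relative-error calculation of the kind the quote
# refers to; the actual fitting-curve equation from the cited paper is not
# reproduced here, and these values are invented.
measured = [1.02, 2.05, 2.97]            # hypothetical measured values
fitted = [1.00, 2.00, 3.00]              # hypothetical fitting-curve values
rel_err = [abs(m - f) / f * 100 for m, f in zip(measured, fitted)]
print([f"{e:.2f}%" for e in rel_err])    # per-point relative error, in percent
```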
“…There are many promising applications for action and event recognition, such as abnormal action and event recognition in surveillance applications [1][2][3][4], interaction action and event recognition in entertainment applications [5][6][7][8], home-based rehabilitation action and event recognition in healthcare applications [9][10][11][12], and many other analogous applications [13][14][15][16][17][18]. According to the definition given by NIST [19], an event is a complex activity occurring at a specific place and time that involves people interacting with other people and/or objects and consists of a number of human actions, processes, and activities.…”
Section: Introduction (mentioning)
confidence: 99%