Determining anatomical frames via inertial motion capture: A survey of methods
Vitali & Perkins (2020)
DOI: 10.1016/j.jbiomech.2020.109832

Cited by 57 publications (37 citation statements)
References 119 publications
“…To do this using IMU sensors, it is necessary to determine the sensor orientation with respect to the segment orientation in a global reference system (sensor-to-segment alignment). A variety of methods can be adopted for this purpose, all of which are dependent on sensor fusion algorithms and determining the initial sensor-to-segment alignment via the implementation of calibration postures or functional movements that the participant must execute [ 5 ]. Acceptable levels of within- and between-participant, and within- and between-tester repeatability, can only be achieved with appropriate sensor fusion algorithm application and accurate and reliable calibration procedures [ 6 , 7 , 8 ].…”
Section: Introduction (mentioning)
confidence: 99%
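The excerpt above notes that sensor-to-segment alignment is typically initialized from calibration postures or functional movements. As a minimal illustrative sketch (not a method from the surveyed paper), the snippet below estimates an alignment rotation from a static pose using only the gravity direction measured by the accelerometer; the function name and the assumed segment vertical axis are hypothetical conventions.

```python
import numpy as np

def static_pose_alignment(accel_static, seg_vertical=np.array([0.0, 0.0, 1.0])):
    """Estimate a sensor-to-segment rotation from a static calibration pose.

    accel_static: (N, 3) accelerometer samples recorded while the participant
    holds a known posture, so the mean specific force approximates gravity
    expressed in the sensor frame.
    seg_vertical: segment axis assumed vertical during the pose (a hypothetical
    convention; real protocols define this per segment).
    Returns a 3x3 rotation matrix taking sensor-frame vectors into a frame
    whose vertical axis matches the segment's.
    """
    # Average over the still window to suppress sensor noise; the mean
    # accelerometer reading points "up" (opposite gravity) in the sensor frame.
    g_sensor = accel_static.mean(axis=0)
    g_sensor = g_sensor / np.linalg.norm(g_sensor)

    # Rotation aligning g_sensor with seg_vertical (Rodrigues' formula).
    v = np.cross(g_sensor, seg_vertical)
    s = np.linalg.norm(v)                 # sin of the angle between the vectors
    c = float(g_sensor @ seg_vertical)    # cos of the angle between the vectors
    if s < 1e-8:
        if c > 0:                         # already aligned
            return np.eye(3)
        # Anti-parallel: rotate 180 degrees about any perpendicular axis.
        axis = np.cross(g_sensor, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(g_sensor, [0.0, 1.0, 0.0])
        axis = axis / np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)
```

Gravity constrains only two of the three rotational degrees of freedom, leaving the heading about the vertical axis undetermined; this is precisely why static postures are usually complemented by functional movements, as the excerpt describes.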
“…We mitigate these uncertainties by using MOCAP data to establish sensor-to-segment alignment and to manually correct misidentified footfalls. While this yields a method that is not truly "MOCAP-free", these topics (sensor-to-segment alignment and still period detection from IMU data alone) are themselves active areas of research for human applications [39][40][41].…”
Section: PLOS ONE (mentioning)
confidence: 99%
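Still-period detection from IMU data alone, mentioned in the excerpt above as an active research area, is often approximated with simple energy thresholds. The sketch below flags windows where the gyroscope reading is near zero and the accelerometer magnitude stays near gravity; the function name and threshold values are illustrative assumptions, not a published detector.

```python
import numpy as np

def detect_still_periods(gyro, accel, fs, win_s=0.1,
                         gyro_thresh=0.05, accel_dev_thresh=0.3, g=9.81):
    """Flag samples where the IMU is plausibly stationary.

    gyro:  (N, 3) angular rates in rad/s.
    accel: (N, 3) specific forces in m/s^2.
    A sample counts as 'still' when, over a sliding window of win_s seconds,
    the mean gyroscope magnitude is small and the accelerometer magnitude
    stays close to gravity. Thresholds are illustrative and would need
    tuning for a given sensor, mounting, and task.
    """
    win = max(1, int(win_s * fs))
    gyro_mag = np.linalg.norm(gyro, axis=1)
    accel_dev = np.abs(np.linalg.norm(accel, axis=1) - g)

    # Moving averages via convolution; mode="same" keeps the output
    # aligned with the input samples.
    kernel = np.ones(win) / win
    gyro_smooth = np.convolve(gyro_mag, kernel, mode="same")
    accel_smooth = np.convolve(accel_dev, kernel, mode="same")

    return (gyro_smooth < gyro_thresh) & (accel_smooth < accel_dev_thresh)
```

In practice such detectors are tuned per sensor and task, which is one reason the cited work fell back on MOCAP data rather than relying on IMU-only detection.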
“…The ensuing process of key release may also be reflected in the recorded brain dynamics; however, none of this behavioral and psychophysiological information is typically recorded, neither in past nor still current practice. Growing appreciation of the importance of the embodied cognition perspective on mental life (Shapiro, 2019), new lightweight, low cost methods of recording details of brain activities and motor behavior of experiment participants (Casson, 2019) (Jas et al, 2021) (Vitali & Perkins, 2020), and emergence of the practice of recording both brain activity and behavior (as well as psychophysiology) at higher resolution (sometimes termed Mobile Brain/Body Imaging or MoBI) (Makeig et al, 2009), make development of a suitable data annotation framework ever more urgent.…”
Section: Documenting Participant Responses (mentioning)
confidence: 99%
“…Though here we focus on MEEG applications, event annotation standards and practices essential for MEEG data analysis can be applied equally well to other types of neuroimaging time series data including fMRI. For example, growing appreciation of the importance of embodied cognition on mental life (Shapiro, 2019), new lightweight, low cost methods of recording details of brain activities and motor behavior of experiment participants (Casson, 2019) (Jas et al, 2021) (Vitali & Perkins, 2020), and emergence of the practice of recording both brain activity and behavior (as well as psychophysiology) at higher resolution in a broader range of tasks and task environments (often termed Mobile Brain/Body Imaging or MoBI) (Makeig et al, 2009), make development of a suitable and more comprehensive data annotation framework ever more urgent.…”
Section: Introduction (mentioning)
confidence: 99%