Automated Method for Computing the Morphological and Clinical Parameters of the Proximal Femur Using Heuristic Modeling Techniques (2010)
DOI: 10.1007/s10439-010-9965-x

Abstract: Bone morphology and morphometric measurements of the lower limb provide significant and useful information for computer-assisted orthopedic surgery planning and intervention, surgical follow-up evaluation, and personalized prosthesis design. The femoral head radius and center, the neck axis and size, the femoral offset, and the shaft axis are morphological and functional parameters of the proximal femur used in both diagnosis and therapy. Obtaining this information from image data without any operator supervision or manual…

Cited by 29 publications (37 citation statements)
References 34 publications
“…Statistical models of bone structures have been investigated extensively with the aim of enhancing automation in CT and MR image segmentation [6][7][8][9][10][11][12] and reconstructing patient-specific shape models from a small number of X-ray images or even from a single image [13][14][15]. Innovative methods for bone shape analysis have been applied to 3D mesh segmentation and clinical feature detection [16][17][18][19][20][21]. In this domain, automatic shape analysis is strongly advocated because it offers the opportunity to detect morphological and clinical landmarks (MCL) with superior repeatability in comparison to human operators.…”
Section: Introduction
confidence: 99%
“…Therefore, intra- and inter-observer variability is an important factor that should be taken into account when using landmark-based clinical parameters. Several studies have been performed to assess the reproducibility of landmark identification on CT or MR(-based) images and the corresponding morphological parameters (Cerveri et al. 2010, Lerner et al. 2003, Nofrini et al. 2004, Subburaj et al. 2009, Victor et al. 2009, Yoshino et al. 2001); see Table 1. A direct comparison between studies is not feasible because of the different methods used, but their results indicate that mean variabilities of 2-3 mm/° are not uncommon for some of the landmarks and parameters.…”
Section: Virtual Landmark Identification
confidence: 99%
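The 2-3 mm/° variability quoted above is typically obtained by comparing repeated landmark picks across observers. A minimal sketch of one such computation, assuming landmark coordinates in millimetres and a simple deviation-from-consensus metric; the function name and the sample picks are hypothetical and not taken from the cited studies:

```python
# Illustrative only (not from the cited studies): inter-observer variability of
# a single anatomical landmark, measured as the mean Euclidean deviation of
# each observer's pick from the consensus (mean) position.
import numpy as np

def landmark_variability(picks_mm):
    """picks_mm: (n_observers, 3) landmark coordinates in mm.

    Returns the mean deviation (mm) from the consensus position.
    """
    picks = np.asarray(picks_mm, dtype=float)
    consensus = picks.mean(axis=0)
    return np.linalg.norm(picks - consensus, axis=1).mean()

# Hypothetical picks of one landmark by four observers (mm)
picks = [[10.1, -5.3, 30.2], [9.6, -4.8, 29.5],
         [10.4, -5.1, 30.8], [9.9, -5.6, 29.9]]
print(f"mean deviation: {landmark_variability(picks):.1f} mm")
```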
“…A method based on geometric analysis allowed the extraction of the femoral head and neck from the proximal femur, and a sphere-fitting algorithm was used to compute the center and radius of the femoral head [13]. The automatic processing of the distal femur surface allowed 3D segmentation of the condyles and intercondylar fossa [14].…”
Section: Introduction
confidence: 99%
“…In earlier publications [13,14], we proposed a computational framework to process the 3D shape model of the femur and extract the shaft and the distal and proximal parts. A method based on geometric analysis allowed the extraction of the femoral head and neck from the proximal femur, and a sphere-fitting algorithm was used to compute the center and radius of the femoral head [13].…”
Section: Introduction
confidence: 99%
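Both statements above refer to fitting a sphere to the segmented femoral-head surface to recover its center and radius. As a minimal sketch (not the authors' implementation), an algebraic least-squares sphere fit over the head vertices could look like the following; the synthetic point data and noise level are illustrative assumptions:

```python
# Sketch of an algebraic least-squares sphere fit to 3D surface points,
# as could be used to estimate a femoral head center and radius.
import numpy as np

def fit_sphere(points):
    """Fit a sphere to an (N, 3) array of points.

    Expands |p - c|^2 = r^2 into the linear system
    2 p·c + (r^2 - |c|^2) = |p|^2 and solves it in least squares.
    Returns (center, radius).
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((p.shape[0], 1))])  # unknowns: cx, cy, cz, d
    b = np.sum(p * p, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)             # d = r^2 - |c|^2
    return center, radius

# Synthetic test: noisy points on a 24 mm sphere centered at (10, -5, 30) mm
rng = np.random.default_rng(0)
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([10.0, -5.0, 30.0]) + 24.0 * dirs + rng.normal(scale=0.2, size=(500, 3))
center, radius = fit_sphere(pts)
print(center, radius)  # ≈ [10, -5, 30], ≈ 24
```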