2018
DOI: 10.1016/j.media.2018.09.006
A hybrid camera- and ultrasound-based approach for needle localization and tracking using a 3D motorized curvilinear ultrasound probe


Cited by 14 publications (8 citation statements). References 45 publications (99 reference statements).
“…13 Beigi et al trained a probabilistic SVM on temporal features for pixel classification and computed a probability map of the segmented pixels, followed by a Hough transform for needle localization. 15 Besides, a hybrid camera- and US-based method was designed in the work of Daoud et al to localize and track the inserted needle. 16 Recently, deep convolutional neural networks (CNNs) have shown the capability of learning hierarchical features that map the data space to the objective space. 17 Owing to their power of nonlinear fitting, CNNs have achieved outstanding performance on various medical image-based tasks, 18 such as segmentation, 19-23 cancer diagnosis, 24 and localization. 10 For needle segmentation and localization, Pourtaherian et al trained a CNN model to identify needle voxels among other echogenic structures and also built a fully convolutional network (FCN) to label the needle parts, where both methods were followed by RANSAC for needle-axis estimation and visualization. 25 They also attempted to adopt dilated CNNs to localize partially visible needles in US images. 26 In addition, to implement automatic needle segmentation in MRI, Mehrtash et al presented a deep anisotropic 3D FCN with skip connections 10 inspired by the 3D U-Net model. 27 Multi-needle detection in 3D US images for US-guided prostate brachytherapy is lacking attention in current studies.…”
Section: Introduction
confidence: 99%
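The RANSAC needle-axis estimation step described in the statement above can be sketched as follows. This is a generic illustration, not the implementation from any of the cited papers: the function name `ransac_line_3d` and all parameter values are our own assumptions. The idea is to repeatedly sample two candidate needle voxels, score the line they define by its inlier count, and refine the best line with a principal-component fit over its inliers.

```python
import numpy as np

def ransac_line_3d(points, n_iters=200, inlier_tol=1.0, rng=None):
    """Fit a 3D line (centroid + unit direction) to candidate needle voxels
    with RANSAC. `points` is an (N, 3) array; `inlier_tol` is the maximum
    point-to-line distance (in voxel units) for a point to count as an inlier.
    All parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(n_iters):
        # Sample two distinct points; the line through them is the candidate model.
        a, b = points[rng.choice(len(points), size=2, replace=False)]
        d = b - a
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue  # degenerate sample: both points coincide
        d = d / norm
        # Distance of every point to the infinite line through `a` with direction `d`.
        diff = points - a
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        inliers = dist < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine: least-squares line through the inliers (centroid + first
    # right-singular vector of the centered inlier cloud).
    pts = points[best_inliers]
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]
```

In practice the segmentation network supplies `points` (the voxels classified as needle), and the refined axis is what gets visualized.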
“…3. Although a recent study focused on 3D volumetric data with temporal information [49], it relied on an extra camera to provide supporting information, which achieved better performance than ROI-based Kalman filtering.…”
Section: Parametric Space Methods
confidence: 99%
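The ROI-based Kalman filtering baseline mentioned above can be illustrated with a minimal constant-velocity filter for a 2D needle-tip position. This is a generic sketch of the tracking idea, not the formulation of [49] or the methods it compares; the class name and noise parameters are illustrative assumptions.

```python
import numpy as np

class TipKalman:
    """Constant-velocity Kalman filter for a needle-tip position.
    State is [x, y, vx, vy]; measurements are noisy tip detections (x, y).
    Noise magnitudes `q` (process) and `r` (measurement) are assumptions."""

    def __init__(self, x0, dt=1.0, q=1e-3, r=0.5):
        self.x = np.array([x0[0], x0[1], 0.0, 0.0])
        self.P = np.eye(4)
        # Constant-velocity transition: position advances by velocity * dt.
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt
        # We observe position only.
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = q * np.eye(4)
        self.R = r * np.eye(2)

    def step(self, z):
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the measured tip position z = (x, y).
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Feeding per-frame tip detections through `step` smooths the trajectory and supplies a prediction for the next frame's search ROI, which is the essence of the ROI-based approach.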
“…

| Method | Year | Data | Application | Evaluation |
| --- | --- | --- | --- | --- |
| (truncated) | | | Needle biopsy/anesthesia/therapy | ex-vivo |
| Kaya et al [41] | 2015 | 2D+t | Needle biopsy/drug delivery | in-vitro |
| Pourtaherian et al [37] | 2016 | 3D | Needle anesthesia/ablation | ex-vivo |
| Beigi et al [45] | 2016 | 2D+t | Needle biopsy/nerve block/anesthesia | in-vitro/in-vivo |
| Mwikirize et al [43] | 2016 | 2D | Needle biopsy/ablation/anesthesia | ex-vivo |
| Beigi et al [48] | 2016 | 2D+t | Needle biopsy/nerve block/anesthesia | in-vivo |
| Kaya et al [46] | 2016 | 2D+t | Needle biopsy/drug delivery | in-vitro |
| Daoud et al [49] | 2018 | 3D | Needle intervention | ex-vivo |
| Daoud et al [24] | 2018 | 2D | Needle intervention | ex-vivo |
| Agarwal et al [35] | 2019 | 2D+t | Anesthesia/biopsy/brachytherapy | in-vitro |
| (truncated) | | | Needle biopsy/ablation/anesthesia | ex-vivo |
| Yang et al [61] | 2019 | 3D | Cardiac catheterization | in-vitro/ex-vivo/in-vivo |
| Yang et al [70] | 2019 | 3D | Cardiac catheterization | ex-vivo |
| Yang et al [82] | 2019 | 3D | Cardiac catheterization | ex-vivo |
| Yang et al [83] | 2019 | 3D | Cardiac catheterization | ex-vivo |
| Mwikirize et al [89] | 2019 | 2D+t | Needle biopsy/anesthesia | in-vitro/ex-vivo |
| Mwikirize et al [88] | 2019 | 2D+t | Needle biopsy/anesthesia | ex-vivo |
| Arif et al [87] | 2019 | 3D | Needle biopsy | in-vitro/in-vivo |
| Yang et al [85] | 2019 | 3D | Cardiac catheterization | ex-vivo/in-vivo |
| Min et al [76] | 2020 | 3D | Cardiac catheterization | ex-vivo |
| Rodgers et al [80] | 2020 | 2D/3D | Interstitial gynecologic brachytherapy | in-vitro/in-vivo |
| Zhang et al [68], [67] | 2020 | 3D | Prostate brachytherapy | in-vivo |
| Zhang et al [86] | 2020 | 3D | Prostate brachytherapy | in-vivo |
| Zhang et al [90] | 2020 | 3D | Prostate brachytherapy | in-vivo |
| Lee et al [79] | 2020 | 2D | Needle biopsy | in-vivo |

With the above summaries in mind, despite the satisfactory results obtained by current methods, several challenges and limitations remain in this area. We discuss these challenges below.…”
Section: Reference
confidence: 99%
“…We used a multi-part loss function as in basic Yolo [58] to train the nYolo network. As shown in (9), the total loss is composed of three parts: the needle-ness loss ($\mathcal{L}_N$), the confidence score for the existence of the needle ($\mathcal{L}_C$), and the regressor loss ($\mathcal{L}_L$) that predicts the line parameters in each grid cell. The term $\mathbb{1}_i^{N}$ denotes the existence and $\mathbb{1}_i^{NoN}$ the absence of the needle in cell $i$.…”
Section: nYolo Needle Detection Network
confidence: 99%
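A multi-part grid loss of this kind can be sketched as below. This is a hedged illustration of the general Yolo-style pattern, not the paper's equation (9): the function name, the λ weights, and the array shapes are our own assumptions. Cells that contain the needle contribute a confidence term and a line-parameter regression term; empty cells contribute a down-weighted confidence term.

```python
import numpy as np

def nyolo_style_loss(pred_conf, pred_params, tgt_conf, tgt_params,
                     lam_needle=1.0, lam_noobj=0.5, lam_loc=5.0):
    """Sketch of a multi-part, Yolo-style grid loss. `pred_conf`/`tgt_conf`
    are per-cell confidences (shape (G,)); `pred_params`/`tgt_params` are
    per-cell line parameters (shape (G, P)). The lambda weights are
    illustrative, not the cited paper's values."""
    obj = tgt_conf > 0.5   # indicator 1_i^N: needle present in cell i
    noobj = ~obj           # indicator 1_i^NoN: needle absent in cell i
    # Confidence loss over needle cells.
    l_needle = np.sum((pred_conf[obj] - tgt_conf[obj]) ** 2)
    # Down-weighted confidence loss over empty cells.
    l_conf = np.sum((pred_conf[noobj] - tgt_conf[noobj]) ** 2)
    # Line-parameter regression loss, only where the needle exists.
    l_loc = np.sum((pred_params[obj] - tgt_params[obj]) ** 2)
    return lam_needle * l_needle + lam_noobj * l_conf + lam_loc * l_loc
```

The indicator masks mirror the $\mathbb{1}_i^{N}$ / $\mathbb{1}_i^{NoN}$ terms in the statement: regression gradients flow only through cells the ground truth marks as containing the needle.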