2018
DOI: 10.1007/s11548-018-1721-y

Convolution neural networks for real-time needle detection and localization in 2D ultrasound

Abstract: The proposed method is fully automatic and provides robust needle localization results in challenging scanning conditions. The accurate and robust results coupled with real-time detection and sub-second total processing make the proposed method promising in applications for needle detection and localization during challenging minimally invasive ultrasound-guided procedures.

Cited by 57 publications (43 citation statements)
References 16 publications
“…Our precision (83.2 [70.9, 89.8]%) was much higher than that observed by Mwikirize et al 30 prior to preprocessing, with a similar recall rate; however, Mwikirize et al emphasized the importance of preprocessing US images to reduce high-intensity artifacts in the image, providing a vast improvement in the recall and precision rates of their algorithm. Although preprocessing was avoided in our study to more closely reflect the intraoperative clinical realities, this may be an area of future investigation to further improve the accuracy of our results.…”
Section: Discussion (contrasting)
confidence: 45%
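The preprocessing step highlighted in this citation (suppressing high-intensity artifacts before detection) can be illustrated with a minimal sketch. The clipping threshold, function name, and use of NumPy/OpenCV below are assumptions for illustration, not the pipeline published by Mwikirize et al.

```python
# Illustrative sketch only: clip bright outliers in a B-mode ultrasound frame
# and rescale, assuming an 8-bit grayscale image. The percentile threshold is
# an assumption, not a value from the cited papers.
import cv2
import numpy as np

def suppress_bright_artifacts(frame: np.ndarray, percentile: float = 99.0) -> np.ndarray:
    """Clip intensities above the chosen percentile and rescale to [0, 255]."""
    ceiling = float(np.percentile(frame, percentile))
    clipped = np.clip(frame.astype(np.float32), 0.0, ceiling)
    rescaled = 255.0 * clipped / max(ceiling, 1e-6)   # avoid divide-by-zero
    return rescaled.astype(np.uint8)

# Usage (file name is a placeholder):
# us_frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
# cleaned = suppress_bright_artifacts(us_frame)
```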
“…[29][30][31][32][33][34][35][36][37][38] Additionally, approaches leveraging the principles of transfer learning (applying models developed for a prior task to a new context) have been investigated in the broader context of image segmentation, including a recent study in x-ray computed tomography (CT) images by Jung et al. 39. In this paper, Jung et al. 39 investigated the generalizability of a CNN-based method developed on a tandem-and-ovoids intracavitary brachytherapy device to segment other devices used for this type of procedure, substantiating the feasibility of a generalized tool segmentation method. Some existing works have specifically investigated CNN-based approaches to tool segmentation in US images, [29][30][31]34,37,38 recognizing that the appearance of tools in this imaging modality introduces unique challenges. Recent developments by Pourtaherian et al., 31 Arif et al., 29 and Zhang et al. 38 focused on needle detection in 3D US images.…”
Section: Introduction (mentioning)
confidence: 99%
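The transfer-learning idea summarized in this citation (reusing a model trained for one segmentation task on a new device or modality) follows a standard fine-tuning pattern. The PyTorch sketch below is a generic illustration of that pattern under assumed layer sizes and placeholder data; it is not the architecture of Jung et al. or of the other cited works.

```python
# Generic fine-tuning sketch (assumed layer sizes and names, not the cited
# architectures): freeze an ImageNet-pretrained ResNet-18 encoder and train a
# small segmentation head for binary tool masks.
import torch
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder = nn.Sequential(*list(backbone.children())[:-2])   # drop avgpool + fc

for p in encoder.parameters():                              # keep pretrained weights fixed
    p.requires_grad = False

head = nn.Sequential(                                       # lightweight decoder
    nn.Conv2d(512, 64, kernel_size=3, padding=1),
    nn.ReLU(inplace=True),
    nn.Conv2d(64, 1, kernel_size=1),
    nn.Upsample(scale_factor=32, mode="bilinear", align_corners=False),
)
model = nn.Sequential(encoder, head)

optimizer = torch.optim.Adam(head.parameters(), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()

# One hypothetical training step on placeholder data (grayscale US frames
# replicated to 3 channels to match the pretrained encoder's input).
x = torch.rand(2, 3, 256, 256)
y = torch.randint(0, 2, (2, 1, 256, 256)).float()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```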
“…In another study, signal transmission maps and statistical optimization were used to model attenuation information and detect the needle tip. The authors combined this pipeline with a deep learning model to filter out motion events [24,25]. Daoud et al. [26] used needle-induced motion and edge detection to localize needles in US images.…”
Section: Related Studies in the Literature (mentioning)
confidence: 99%
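The motion-plus-edge strategy attributed to Daoud et al. in this citation can be sketched, purely as an illustration, with frame differencing followed by edge and line detection; all parameters below are assumptions rather than the published values.

```python
# Minimal sketch of the motion-plus-edge idea (parameters are assumptions,
# not those of the cited study): difference consecutive frames to emphasize
# needle-induced motion, then fit a dominant straight segment.
import cv2
import numpy as np

def localize_needle_segment(prev_frame: np.ndarray, curr_frame: np.ndarray):
    motion = cv2.absdiff(curr_frame, prev_frame)          # needle-induced change
    motion = cv2.GaussianBlur(motion, (5, 5), 0)          # suppress speckle noise
    edges = cv2.Canny(motion, 30, 90)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=10)
    if lines is None:
        return None
    # Return the longest detected segment as the needle-shaft estimate.
    return max(lines[:, 0], key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
```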
“…Recently, a template-based tracking method using efficient second-order minimization has been used to track the needle [16]. In recent studies, increasingly novel approaches have been used to locate the needle and estimate its tip in sagittal US images, such as signal attenuation maps [17], convolution neural networks (CNN) [18], and the maximum likelihood estimation sample consensus (MLESAC) method [19]. However, a drawback of using sagittal US images is that out-of-plane bending of the needle cannot be detected.…”
Section: Introduction (mentioning)
confidence: 99%
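MLESAC, mentioned in this citation, is a robust model-fitting strategy. As a simplified stand-in, the plain RANSAC sketch below fits a line to candidate needle pixels and takes the most distal inlier as a crude tip estimate; it illustrates robust fitting in general, not the cited MLESAC implementation.

```python
# Simplified RANSAC line fit, used here as a stand-in for MLESAC (thresholds
# and the tip heuristic are assumptions, not the cited implementation).
import numpy as np

def ransac_needle_fit(points: np.ndarray, n_iter: int = 200, tol: float = 2.0):
    """points: (N, 2) array of (x, y) candidate needle pixels."""
    rng = np.random.default_rng(0)
    best_inliers, best_dir = None, None
    for _ in range(n_iter):
        p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
        d = p2 - p1
        norm = np.linalg.norm(d)
        if norm < 1e-6:
            continue
        # Perpendicular distance of every point to the line through p1 and p2.
        dist = np.abs(d[0] * (points[:, 1] - p1[1]) -
                      d[1] * (points[:, 0] - p1[0])) / norm
        inliers = points[dist < tol]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_inliers, best_dir = inliers, d / norm
    # Crude tip estimate: the inlier farthest along the fitted direction
    # (which end of the shaft is the tip remains ambiguous in this sketch).
    tip = best_inliers[np.argmax(best_inliers @ best_dir)]
    return best_inliers, tip
```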