2015
DOI: 10.1016/j.compmedimag.2014.06.016
Enhanced needle localization in ultrasound using beam steering and learning-based segmentation

Cited by 46 publications (45 citation statements)
References 34 publications (42 reference statements)
“…Most assume that the needle exhibits the brightest pixels in the image; furthermore, they identify linear structures (Radon and Hough transform), filter specific transitions of bright and dark pixels (Gabor filter) and can eliminate outliers based on statistical probability (Kalman and RANSAC). In 2D scanning, detection of the needle angle allows the US beam to be perfectly steered for needle visibility enhancement. If combined with 3D scanning, automated needle detection algorithms do not require perfect alignment of needle and probe, addressing the problem of unintentional probe manipulation during interventions.…”
Section: Results
mentioning, confidence: 99%
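
The techniques named in this statement can be combined quite directly: keep only the brightest pixels and vote for the dominant straight line. The sketch below is an illustrative assumption (a Hough-transform variant using scikit-image, a percentile threshold, and a synthetic test image), not code from the cited paper.

```python
# Minimal sketch, not from the cited paper: keep the brightest pixels of a
# B-mode image and fit the dominant straight line with a Hough transform.
# The percentile threshold and the synthetic test image are assumptions.
import numpy as np
from skimage.transform import hough_line, hough_line_peaks

def localize_needle(bmode):
    """Return (angle in radians, distance in pixels) of the dominant bright line."""
    # Assume the needle is among the brightest reflectors (top 1% of intensities).
    candidates = bmode >= np.percentile(bmode, 99)
    # Accumulate Hough votes over the candidate pixels and take the strongest peak.
    hspace, angles, dists = hough_line(candidates)
    _, peak_angles, peak_dists = hough_line_peaks(hspace, angles, dists, num_peaks=1)
    return peak_angles[0], peak_dists[0]

if __name__ == "__main__":
    # Synthetic example: a bright oblique line embedded in Rayleigh-like speckle.
    rng = np.random.default_rng(0)
    img = rng.rayleigh(scale=10.0, size=(256, 256))
    rows = np.arange(60, 200)
    img[rows, (0.8 * rows).astype(int)] = 255.0
    angle, dist = localize_needle(img)
    print(f"estimated angle: {np.degrees(angle):.1f} deg, offset: {dist:.1f} px")
```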
“…In recent work, Hatt et al. introduced a machine-learning approach to segment the needle in beam-steered B-mode images with the needle orientation known a priori. It uses a classifier to segment pixels as needle or background, followed by a Radon transform for localization [9]. With these approaches, however, steering to large angles remains a challenge, especially for curvilinear array transducers, and tissue boundaries are also enhanced.…”
Section: Introduction
mentioning, confidence: 99%
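
The pipeline this statement describes (pixel-wise classification followed by a Radon transform for localization) can be sketched as below. The two per-pixel features, the random-forest classifier, and the helper names are illustrative assumptions, not the authors' actual feature set or implementation.

```python
# Minimal sketch under stated assumptions (not the authors' implementation):
# classify pixels as needle vs. background, then localize the needle axis from
# the peak of the Radon transform of the predicted mask.
import numpy as np
from scipy import ndimage
from skimage.transform import radon
from sklearn.ensemble import RandomForestClassifier

def pixel_features(img):
    """Per-pixel feature vectors: raw intensity and smoothed gradient magnitude."""
    grad = ndimage.gaussian_gradient_magnitude(img.astype(float), sigma=2.0)
    return np.stack([img.ravel(), grad.ravel()], axis=1)

def localize_needle(img, clf, angles_deg=np.arange(0.0, 180.0, 1.0)):
    """Segment with a trained classifier, then find the dominant line via Radon."""
    mask = clf.predict(pixel_features(img)).reshape(img.shape).astype(float)
    sinogram = radon(mask, theta=angles_deg, circle=False)
    # The brightest sinogram bin corresponds to the strongest straight line.
    offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    return angles_deg[angle_idx], offset_idx

# Training (hypothetical data): images with manually labelled needle masks, e.g.
#   clf = RandomForestClassifier(n_estimators=50).fit(
#       pixel_features(train_img), train_mask.ravel())
```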
“…Transform (e.g., see [18]), Gabor filter (e.g., see [19]), Monogenic Decomposition (e.g., see [20]), or Fractal Analysis (e.g.…”
Section: Introduction
mentioning, confidence: 99%
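
As an illustration of the Gabor-filter option named in this fragment, the following minimal sketch (assuming scikit-image's gabor and an approximate insertion angle) emphasizes needle-like oriented transitions before any line-detection step; the angle and frequency values are assumptions.

```python
# Illustrative sketch only (the quoted fragment just names the technique):
# enhance oriented bright/dark transitions with a Gabor filter aligned to an
# assumed needle insertion angle.
import numpy as np
from skimage.filters import gabor

def enhance_needle(img, insertion_angle_deg=40.0, frequency=0.1):
    """Return the Gabor magnitude response oriented along the expected needle."""
    # skimage's gabor() takes the filter orientation in radians.
    real, imag = gabor(img, frequency=frequency,
                       theta=np.deg2rad(insertion_angle_deg))
    return np.hypot(real, imag)
```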