2013 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2013.6631195

Real-time biopsy needle tip estimation in 2D ultrasound images

Cited by 18 publications (15 citation statements, published 2014–2024); references 15 publications.
“…A 3D optical tracking system (Accutrack 500, Atracsys LLC, a system with active markers and a mean position error of 0.19 mm) is used to estimate relative coordinate transformations among the robots and the phantom. Finally, the setup includes an ultrasound imaging device whose images are visualized on a dedicated graphical interface for the surgeons and processed in real time to detect the position of the needle tip, using the algorithm developed by Mathiassen et al. [32], and provide intraoperative adaptation of robot motion trajectories, as required by the surgical workflow previously described. For the suturing task, the ISUR robot is equipped with two arms (i.e.…”
Section: Robotic Setup
confidence: 99%
“…3.2, specifying the event-driven behavior required to coordinate robot actions and supervise the overall task execution. The cognitive part of the system is completed by Sensing software, which in this case implements real-time US image processing for needle tracking [32], and a Situation Awareness module, implementing Bayesian Networks [44] that process data received from Sensing and robot control software to detect events and exceptions (e.g. forbidden regions touched, force limits exceeded, etc.…”
Section: Implementation and Deployment
confidence: 99%
“…A 3-D optical tracking system is used to estimate relative coordinate transformations among the robots and the phantom. Finally, the setup includes an ultrasound imaging device whose images can be visualized on a dedicated graphical interface for the surgeons and processed in real time to detect the position of the needle tip, as shown in Mathiassen et al. (2013), and provide intra-operative adaptation of robot motion trajectories.…”
Section: Case Study and Robotic Setup
confidence: 99%
“…In addition, Ayvaci et al [18] performed biopsy needle segmentation on TRUS videos for use in MRI/TRUS fusion guided biopsy. Work on needle tip tracking has been performed by Mathiassen et al [19], who developed an optical tracking system based on intensity features in the images and Neubach et al [20], who used a 30…”
Section: Introduction
confidence: 99%
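The citing works above describe tracking the needle tip from intensity features in 2D ultrasound images. The actual algorithm of Mathiassen et al. is not reproduced here; purely as a rough illustration of the general idea, the following toy sketch (with a hypothetical `estimate_tip` function and a known insertion point as an assumed input) thresholds a 2D image, fits the principal axis of the bright pixels as the needle shaft, and takes the shaft end farthest from the entry point as the tip:

```python
import numpy as np

def estimate_tip(image, entry, thresh=0.5):
    """Toy intensity-based needle tip localizer for a 2D image.

    Thresholds the image, fits the principal axis of the bright pixels
    (the needle shaft) via SVD, takes the two extreme pixels along that
    axis, and returns the one farther from the known entry point as the
    tip estimate: a (row, col) tuple, or None if no pixel is bright enough.
    """
    ys, xs = np.nonzero(image >= thresh)
    if len(xs) == 0:
        return None
    pts = np.stack([ys, xs], axis=1).astype(float)
    centered = pts - pts.mean(axis=0)
    # First right-singular vector approximates the needle direction.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[0]
    ends = pts[np.argmin(proj)], pts[np.argmax(proj)]
    entry = np.asarray(entry, dtype=float)
    tip = max(ends, key=lambda p: np.linalg.norm(p - entry))
    return int(tip[0]), int(tip[1])

# Synthetic example: a diagonal "needle" inserted from the top-left corner.
img = np.zeros((64, 64))
for i in range(30):
    img[i, i] = 1.0
print(estimate_tip(img, entry=(0, 0)))  # → (29, 29)
```

A real ultrasound pipeline would additionally need speckle suppression and temporal filtering; this sketch only conveys why intensity features alone can localize a bright, roughly linear needle in a 2D frame.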