2018
DOI: 10.1109/tmi.2018.2794439

3-D Pose Estimation of Articulated Instruments in Robotic Minimally Invasive Surgery

Abstract: Estimating the 3-D pose of instruments is an important part of robotic minimally invasive surgery, both for automating basic procedures and for providing safety features such as virtual fixtures. Image-based methods of 3-D pose estimation offer a non-invasive, low-cost solution compared with methods that incorporate external tracking systems. In this paper, we extend our recent work in estimating rigid 3-D pose with silhouette and optical flow-based features to incorporate the articulated degrees-of-freedom …
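The abstract names silhouette and optical flow-based image features as the cues driving pose estimation. As a minimal sketch of the optical-flow cue only (not the authors' pipeline; the frame paths and motion threshold below are illustrative placeholders), dense flow between consecutive endoscopic frames can be computed with OpenCV:

```python
import cv2
import numpy as np

# Two consecutive endoscopic frames; the file paths are placeholders.
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
if prev is None or curr is None:
    raise SystemExit("placeholder frames not found")

# Dense optical flow (Farneback): pyr_scale=0.5, 3 pyramid levels,
# window 15, 3 iterations, poly_n=5, poly_sigma=1.2, no flags.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Per-pixel flow magnitude; thresholding it gives a crude motion mask
# that could serve as one cue for the moving instrument
# (the threshold value here is arbitrary).
magnitude = np.linalg.norm(flow, axis=2)
motion_mask = magnitude > 1.0
print("moving pixels:", int(motion_mask.sum()))
```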

Cited by 72 publications (38 citation statements); References 42 publications.

Citation statements (ordered by relevance):
“…In terms of applications, it should be noted that most solutions were developed to monitor endoscopic videos for minimally invasive surgeries, with or without robotic assistance (Sarikaya et al., 2017; Ross et al., 2018; Wesierski and Jezierska, 2018; Du et al., 2018; Allan et al., 2018). Other imaging modalities include:…”
Section: Clinical Applications (mentioning, confidence: 99%)
“…For flexible instruments, the goal is also to detect the tool centerline (Chang et al., 2016). Tool detection generally is an intermediate step for tool tracking, the process of monitoring tool location over time (Du et al., 2016; Rieke et al., 2016a; Lee et al., 2017b; Zhao et al., 2017; Czajkowska et al., 2018; Ryu et al., 2018; Keller et al., 2018), and pose estimation, the process of inferring a 2-D pose (Rieke et al., 2016b; Kurmann et al., 2017; Alsheakhali et al., 2016b; Du et al., 2018; Wesierski and Jezierska, 2018) or a 3-D pose (Allan et al., 2018; Gessert et al., 2018) based on the location of tool elements. Tasks associated with tool detection also include velocity estimation (Marban et al., 2017) and instrument state recognition (Sahu et al., 2016a).…”
Section: Computer Vision Tasks (mentioning, confidence: 99%)
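The excerpt above describes 3-D pose estimation as inferring a pose from the detected locations of tool elements. One generic way to perform that step, shown purely as an illustration (the keypoint coordinates and camera intrinsics below are made up, and this is not the method of any cited paper), is a Perspective-n-Point solve on known instrument keypoints:

```python
import cv2
import numpy as np

# Hypothetical 3-D keypoints on a rigid instrument model (metres, tool frame).
object_points = np.array([
    [0.000, 0.000, 0.000],
    [0.010, 0.000, 0.000],
    [0.000, 0.010, 0.000],
    [0.000, 0.000, 0.030],
    [0.010, 0.010, 0.030],
    [-0.010, 0.000, 0.020],
], dtype=np.float64)

# Corresponding 2-D detections in the image (pixels); values are illustrative.
image_points = np.array([
    [320.0, 240.0],
    [355.0, 243.0],
    [322.0, 275.0],
    [330.0, 160.0],
    [362.0, 190.0],
    [295.0, 205.0],
], dtype=np.float64)

# Placeholder pinhole intrinsics; a real system would use calibrated values.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Perspective-n-Point: recover the rotation and translation of the tool
# frame relative to the camera from the 2-D/3-D correspondences.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    print("rotation matrix:\n", R)
    print("translation (m):", tvec.ravel())
```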
“…However, many challenges in algorithm robustness hamper the translation of CAI methods that rely on computer vision into clinical practice. These include classification and segmentation of organs in the camera field of view (FoV) [3], definition of virtual-fixture algorithms to impose a safe distance between surgical tools and sensitive tissues [4], and surgical instrument detection, segmentation and articulated pose estimation [5], [6].…”
Section: Introduction (mentioning, confidence: 99%)
“…Therefore, robotic tool detection, segmentation, tracking and pose estimation are bound to become core technologies in the surgical workflow, improving planning and understanding during the operation. In the context of delicate surgical procedures such as urology [3], it is paramount to provide the clinical operator with accurate real-time information about tool-tissue interactions [4], the 3D position and orientation of the instruments [5], etc., to increase the context-awareness of the operator whilst performing a robotic intervention and to help avoid human errors.…”
Section: Introduction (mentioning, confidence: 99%)