2015
DOI: 10.1016/j.compmedimag.2014.09.003

Vision-based endoscope tracking for 3D ultrasound image-guided surgical navigation


Cited by 25 publications (18 citation statements)
References 36 publications
“…Increasing number of practical utilizations, such as vision based robotics control [58,71,80,108,111,122], surveillance [3,21,25,38,41,89,94,125], navigation [27,32,46,53,70,90,100,121], and electronics in consumer related fields [9,40,49,65,84,113,119], provide applied scenarios for the vision based computation. Most of these applications can be technically attributed into several typical technologies with respect to vision based computation.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, in most implementations, the acquired data are processed after the acquisition and cannot be directly correlated with the sample, delaying the diagnostic and analytical result. Recently, various imaging modalities, such as computed tomography, magnetic resonance imaging, ultrasound image and fluorescence lifetime have been combining their respective imaging information with images of the sample in an augmented way to improve or supplement real‐time surgery. These approaches offer the surgeons a direct interpretation of physical information from the tissues by superimposing the imaging modalities information into the visual field of view, allowing the surgeon to detect and determine locations of disease in a real‐time.…”
Section: Introduction (mentioning)
confidence: 99%
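The overlay idea described in the statement above can be illustrated with a minimal sketch: warping a 2D image from another modality (for example an ultrasound slice) into the endoscopic camera view and alpha-blending it over the live frame. This is not the method of the cited paper; the function name, the homography H, and the blending weight are illustrative assumptions, with H presumed to come from a separate calibration or tracking step.

```python
import cv2
import numpy as np

def overlay_ultrasound(endoscope_frame, us_slice, H, alpha=0.4):
    """Illustrative sketch: warp a 2D ultrasound slice into the endoscope
    view using a homography H (assumed known from calibration/tracking)
    and alpha-blend it over the video frame. Both images are assumed to
    be 3-channel uint8 arrays (BGR)."""
    h, w = endoscope_frame.shape[:2]
    # Map the ultrasound slice into endoscope image coordinates.
    warped = cv2.warpPerspective(us_slice, H, (w, h))
    # Blend only where the warped slice actually has content.
    mask = (warped.sum(axis=2) > 0)[:, :, None]
    blended = np.where(
        mask,
        (alpha * warped.astype(np.float32)
         + (1 - alpha) * endoscope_frame.astype(np.float32)).astype(np.uint8),
        endoscope_frame,
    )
    return blended
```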
“…They range from localization of robots and object manipulation (Choi and Christensen, 2012, Collet et al, 2009), to augmented reality (Müller et al, 2013) implementations running on mobile devices and having only limited resources, thus focusing on fast solutions. Especially in industry, surveying or medical environments, involving machine vision, (close-range) photogrammetry (Luhmann et al, 2006), point-cloud registration (Weinmann et al, 2011) and surgical navigation (Yang et al, 2015), methods are demanded, that are not only robust, but also return a measure of reliability.…”
Section: Introduction (mentioning)
confidence: 99%
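The last point in the statement above, registration methods that also return a measure of reliability, can be sketched with a standard rigid (Kabsch/SVD) alignment of corresponding 3D points, using the RMS residual after alignment as a crude reliability cue. This is a generic illustration, not the algorithm of any of the cited works; the function name and the residual-as-reliability choice are assumptions.

```python
import numpy as np

def rigid_register(src, dst):
    """Illustrative sketch: estimate the rigid transform (R, t) that maps
    src onto dst (both N x 3 arrays of corresponding 3D points) with the
    Kabsch/SVD method, and return the RMS residual as a simple
    reliability cue."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    # The SVD of the cross-covariance gives the optimal rotation.
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    # RMS alignment error as a crude measure of registration reliability.
    residual = np.sqrt(np.mean(np.sum((dst - (src @ R.T + t)) ** 2, axis=1)))
    return R, t, residual
```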