A medical robotic system for teleoperated laser microsurgery, based on a concept we have called the "virtual scalpel", is presented in this paper. This system allows surgeries to be safely and precisely performed using a graphics pen directly over live video of the surgical site. This is shown to eliminate the hand-eye coordination problems that affect other microsurgery systems and to make full use of the operator's manual dexterity without requiring extra training. The implementation of this system, which is based on a tablet PC and a new motorized laser micromanipulator offering 1 µm aiming accuracy within the traditional line-of-sight 2D operative space, is fully described. This includes details on the system's hardware and software structures and on its calibration process, which is essential for guaranteeing precise matching between a point touched on the live video and the laser aiming point at the surgical site. Together, the new hardware and software structures make both the calibration parameters and the laser aiming accuracy (on any plane orthogonal to the imaging axis) independent of the target distance and of its motions. Automatic laser control based on new intraoperative planning software, along with safety improvements based on virtual features, is also described in this paper, which concludes by presenting results from sets of path-following evaluation experiments conducted with 10 different subjects. These demonstrate an error reduction of almost 50% when using the virtual scalpel system versus the traditional laser microsurgery setup, and an 80% error reduction when using the automatic laser control routines. These results evidence substantial improvements in precision and controllability, suggesting that the technological advances presented herein will lead to a significantly enhanced capacity for treating a variety of internal human pathologies.
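The calibration described above must map a point touched on the tablet video to laser aiming coordinates. A minimal illustrative sketch of one such mapping is given below, assuming (purely for illustration, not as the authors' actual method) that the pixel-to-laser relation on a plane orthogonal to the imaging axis is affine and can be fit from three touched/aimed point correspondences; all function names are hypothetical.

```python
# Hypothetical calibration sketch: fit an affine map from tablet-pen
# pixel coordinates (u, v) to laser aiming coordinates (x, y) using
# three point correspondences, then apply it to new pen touches.

def solve3(a, b):
    """Solve a 3x3 linear system a*x = b by Gaussian elimination."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))  # partial pivoting
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, 3):
            f = m[r][i] / m[i][i]
            for c in range(i, 4):
                m[r][c] -= f * m[i][c]
    x = [0.0] * 3
    for i in reversed(range(3)):
        x[i] = (m[i][3] - sum(m[i][c] * x[c] for c in range(i + 1, 3))) / m[i][i]
    return x

def fit_affine(pixels, laser_pts):
    """Fit (u, v) -> (x, y) from exactly 3 correspondences."""
    A = [[u, v, 1.0] for u, v in pixels]
    ax = solve3(A, [x for x, _ in laser_pts])  # row for x = ax . (u, v, 1)
    ay = solve3(A, [y for _, y in laser_pts])  # row for y = ay . (u, v, 1)
    return ax, ay

def apply_affine(ax, ay, u, v):
    """Map one pen touch (u, v) to a laser aiming point (x, y)."""
    return (ax[0]*u + ax[1]*v + ax[2], ay[0]*u + ay[1]*v + ay[2])
```

In practice such a mapping would be refit whenever the optical setup changes; the three-point affine fit is only the simplest case of the calibration problem the abstract refers to.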
This paper proposes an automated robotic system to perform cell microinjections, relieving human operators of this highly difficult and tedious manual procedure. The system, which uses commercial equipment currently found in most biomanipulation laboratories, consists of a multitask software framework combining computer vision and robotic control elements. The vision part features an injection pipette tracker and an automatic cell targeting system responsible for defining injection points within the contours of adherent cells in culture. The main challenge is the use of bright-field microscopy only, without the chemical markers normally employed to highlight the cells. Here, cells are identified and segmented using a threshold-based image processing technique working on defocused images. Fast and precise microinjection pipette positioning over the automatically defined targets is performed by a two-stage robotic system, which achieves an average injection rate of 7.6 cells/min with a pipette positioning precision of 0.23 μm. The consistency of these microinjections and the performance of the visual targeting framework were experimentally evaluated using two cell lines (CHO-K1 and HEK) and over 500 cells. In these trials, the cells were automatically targeted and injected with a fluorescent marker, resulting in a correct cell detection rate of 87% and a successful marker delivery rate of 67.5%. These results demonstrate that the new system is capable of better performance than expert operators, highlighting its benefits and potential for large-scale application.
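The core of the targeting pipeline described above is threshold-based segmentation of defocused bright-field images followed by injection point definition. The following is a minimal sketch of that idea under simplifying assumptions (a global threshold, 4-connected regions, and the region centroid as the injection point); it is an illustration, not the authors' implementation.

```python
# Illustrative sketch: in a defocused bright-field image, cells appear
# brighter than background, so a global threshold plus connected-component
# analysis yields one injection point (the centroid) per detected region.

from collections import deque

def segment_cells(img, thresh):
    """Return (x, y) centroids of connected pixel regions above `thresh`."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if img[y][x] > thresh and not seen[y][x]:
                # BFS flood fill over one 4-connected bright region
                q, pix = deque([(y, x)]), []
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    pix.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx),
                                   (cy, cx+1), (cy, cx-1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] > thresh and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                # injection point = region centroid
                mean_y = sum(p[0] for p in pix) / len(pix)
                mean_x = sum(p[1] for p in pix) / len(pix)
                centroids.append((mean_x, mean_y))
    return centroids
```

A real pipeline would add size and shape filters to reject debris and place the injection point inside the cell contour rather than at the raw centroid; this sketch shows only the threshold-and-target step.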
This research investigates the impact of three different control devices and two visualization methods on the precision, safety and ergonomics of a new medical robotic system prototype for assistive laser phonomicrosurgery. This system allows the user to remotely control the surgical laser beam using either a flight-simulator-type joystick, a joypad, or a pen display system, in order to improve on the traditional surgical setup composed of a mechanical micromanipulator coupled with a surgical microscope. The experimental setup and the protocol followed to obtain quantitative performance data from the control devices tested are fully described here. This includes sets of path-following evaluation experiments conducted with ten subjects of different skill levels, for a total of 700 trials. The data analysis method and experimental results are also presented, demonstrating an average 45% error reduction when using the joypad and up to a 60% error reduction when using the pen display system versus the standard phonomicrosurgery setup. These results demonstrate that the new system can provide important improvements in surgical precision, ergonomics and safety. In addition, the evaluation method presented here is shown to support an objective selection of control devices for this application.
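The error reductions reported above come from path-following trials, which require a quantitative deviation metric between the executed laser path and the reference path. One common choice, shown here as a minimal illustrative sketch (the abstract does not specify the authors' exact metric), is the mean distance from each executed sample to the reference polyline.

```python
# Illustrative path-following error metric: mean distance from each
# executed sample point to the nearest point on the reference polyline.

import math

def point_seg_dist(p, a, b):
    """Distance from point p to the line segment from a to b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    L2 = dx * dx + dy * dy
    if L2 == 0:
        return math.hypot(px - ax, py - ay)  # degenerate segment
    # clamp the projection parameter to stay on the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def mean_path_error(executed, reference):
    """Mean deviation of executed samples from the reference polyline."""
    segs = list(zip(reference, reference[1:]))
    return sum(min(point_seg_dist(p, a, b) for a, b in segs)
               for p in executed) / len(executed)
```

With such a metric, a "45% error reduction" simply means the mean deviation for one device is 45% lower than for the baseline setup over matched trials.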
This article introduces a novel approach to robust automatic detection of unstained living cells in bright-field (BF) microscope images, with the goal of producing a target list for an automated microinjection system. The overall image analysis process is described and includes preprocessing, ridge enhancement, image segmentation, shape analysis and injection point definition. The developed algorithm implements a new version of anisotropic contour completion (ACC) based on the partial differential equation (PDE) for heat diffusion, which improves the cell segmentation process by elongating the edges only along their tangent direction. The developed ACC algorithm is equivalent to a dilation of the binary edge image with a continuous elliptic structuring element that takes into account the local orientation of the contours, preventing extension in the normal direction. Experiments carried out on real images of 10–50 µm CHO-K1 adherent cells show that the algorithm is remarkably reliable, achieving up to 85% success for cell detection and injection point definition.
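The key property of ACC as described above is that edge pixels grow only along their local tangent, never toward the normal, which closes gaps in cell contours without thickening them. A minimal discrete sketch of this behavior is given below; it approximates the orientation-aligned elongated structuring element with a short line of pixels along each edge pixel's tangent angle, and is an illustration rather than the paper's PDE-based formulation.

```python
# Illustrative tangent-only dilation: each edge pixel is extended a few
# steps along its local tangent angle, closing contour gaps without any
# growth in the normal direction (unlike isotropic dilation).

import math

def acc_dilate(edges, angles, steps=2):
    """Extend each set pixel of a binary edge map `edges` along the
    tangent angle given per-pixel in `angles` (radians)."""
    h, w = len(edges), len(edges[0])
    out = [row[:] for row in edges]
    for y in range(h):
        for x in range(w):
            if edges[y][x]:
                t = angles[y][x]
                for s in range(-steps, steps + 1):
                    nx = x + round(s * math.cos(t))
                    ny = y + round(s * math.sin(t))
                    if 0 <= ny < h and 0 <= nx < w:
                        out[ny][nx] = 1
    return out
```

For a horizontal edge (tangent angle 0) with a small gap, the extension fills the gap along the row while the rows above and below stay empty, which is exactly the anisotropy the ACC formulation is designed to provide.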