In the context of ultrasound (US)-guided breast biopsy, image fusion techniques can be employed to track the position of US-invisible lesions previously identified on a pre-operative image. Such methods must account for the large anatomical deformations caused by probe pressure during US scanning while meeting real-time constraints. Although biomechanical models based on the finite element (FE) method are the preferred approach to modeling breast behavior, they cannot achieve real-time performance. In this paper we propose to use deep neural networks to learn the large deformations occurring in US-guided breast biopsy and to provide accurate, real-time predictions of lesion displacement. We train a U-Net architecture on a relatively small amount of synthetic data generated in an offline phase from FE simulations of probe-induced deformations on the breast anatomy of interest. Overall, both training-data generation and network training are completed in less than 5 hours, which is clinically acceptable considering that the biopsy is performed at most one day after the pre-operative scan. The method is tested on both synthetic data and real data acquired on a realistic breast phantom. Results show that our method correctly learns the deformable behavior modelled via FE simulations and generalizes to real data, achieving a target registration error comparable to that of FE models while being about a hundred times faster.
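The core idea of this abstract, running an expensive simulator offline to train a network that is then fast enough for real-time inference, can be sketched in miniature. The snippet below is a toy illustration only: the `fe_surrogate` function, the tiny fully connected network, the normalization constants, and the learning rate are all invented for the demo and stand in for the paper's full FE simulations and U-Net.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the FE simulator: a smooth nonlinear map from probe
# indentation depth to lesion displacement (both in mm). The real pipeline
# runs full finite element simulations of probe-induced breast deformation.
def fe_surrogate(depth_mm):
    return 0.6 * depth_mm + 2.0 * np.tanh(0.15 * depth_mm)

# Offline phase: sample probe indentations and record the simulated lesion
# displacement, normalizing inputs and outputs to roughly [0, 1].
depth = rng.uniform(0.0, 30.0, size=(500, 1))
x_train = depth / 30.0
y_train = fe_surrogate(depth) / 20.0

# Tiny fully connected network trained by full-batch gradient descent
# (a drastically simplified stand-in for the paper's U-Net).
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.2
for _ in range(8000):
    h = np.tanh(x_train @ W1 + b1)
    err = h @ W2 + b2 - y_train                  # (500, 1) residuals
    gW2 = h.T @ err / len(x_train); gb2 = err.mean(0)
    gh = (err @ W2.T) * (1.0 - h ** 2)           # back-prop through tanh
    gW1 = x_train.T @ gh / len(x_train); gb1 = gh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Online phase: inference is two small matrix products, fast enough for
# real-time use, unlike re-running the FE model for every probe pose.
def predict_mm(depth_mm):
    h = np.tanh(np.array([[depth_mm / 30.0]]) @ W1 + b1)
    return ((h @ W2 + b2) * 20.0).item()
```

The expensive part (here, sampling the surrogate; in the paper, FE simulation) happens once per patient anatomy, while the trained network answers each query with a few matrix products.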
Purpose: Although ultrasound (US) imaging is the most popular modality for guiding breast biopsy, malignant regions are often missed by sonography, preventing the accurate lesion localization that is essential for a successful procedure. Biomechanical models can support the localization of suspicious areas identified on a pre-operative image during US scanning, since they account for the anatomical deformations caused by US probe pressure. We propose a deformation model that relies on the position-based dynamics (PBD) approach to predict the displacement of internal targets induced by probe interaction during US acquisition. Methods: The PBD implementation available in NVIDIA FleX is exploited to create an anatomical model capable of deforming online. Simulation parameters are initialized on a calibration phantom under different levels of probe-induced deformation and then fine-tuned by minimizing the localization error of a US-visible landmark in a realistic breast phantom. The updated model is used to estimate the displacement of other internal lesions due to probe-tissue interaction. Results: The localization error obtained when applying the PBD model remains below 11 mm for all tumors, even for input displacements on the order of 30 mm. The proposed method achieves results aligned with those of finite element (FE) models but with faster computational performance, making it suitable for real-time applications. In addition, it outperforms the rigid model used to track lesion position in US-guided breast biopsies, at least halving the localization error over all displacement ranges considered.
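Position-based dynamics, the scheme underlying NVIDIA FleX, follows a predict-project-update loop: positions are first advanced by external forces, then iteratively corrected so that geometric constraints hold, and velocities are finally recomputed from the accepted positions. The following minimal 2-D sketch (after Müller et al.'s PBD formulation) shows that loop for a chain of particles with distance constraints; the chain length, time step, iteration count, and damping factor are illustrative choices, not the paper's calibrated parameters, and FleX implements a far richer variant of the same idea.

```python
import numpy as np

def pbd_step(x, v, inv_mass, rest_len, dt=0.016, n_iters=20,
             gravity=-9.81, damping=0.98):
    """One position-based dynamics step for a chain of 2-D particles."""
    # 1. Predict positions from current velocities and external forces
    #    (scaling gravity by inverse mass conveniently freezes pinned
    #    particles, which have inv_mass == 0).
    v = v + dt * gravity * inv_mass[:, None] * np.array([0.0, 1.0])
    p = x + dt * v
    # 2. Gauss-Seidel projection of the distance constraints onto the
    #    predicted positions, weighted by inverse mass.
    for _ in range(n_iters):
        for i in range(len(p) - 1):
            d = p[i + 1] - p[i]
            dist = np.linalg.norm(d)
            w = inv_mass[i] + inv_mass[i + 1]
            if dist < 1e-12 or w == 0.0:
                continue
            corr = (dist - rest_len) * d / dist
            p[i] += inv_mass[i] / w * corr
            p[i + 1] -= inv_mass[i + 1] / w * corr
    # 3. Recover velocities from the positional change, damp, accept.
    v = damping * (p - x) / dt
    return p, v

# Chain of 5 particles starting horizontal; the first is pinned
# (inverse mass 0), loosely analogous to tissue fixed at the chest wall.
n = 5
x = np.stack([np.linspace(0.0, 1.0, n), np.zeros(n)], axis=1)
v = np.zeros_like(x)
inv_mass = np.ones(n); inv_mass[0] = 0.0
for _ in range(200):
    x, v = pbd_step(x, v, inv_mass, rest_len=0.25)
```

Because the constraint projection operates directly on positions, each step is cheap and unconditionally stable, which is what makes PBD attractive for real-time deformation compared with solving an FE system.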
The execution of surgical tasks by an Autonomous Robotic System (ARS) requires an up-to-date model of the current surgical environment, which must be deduced from measurements collected during task execution. In this work, we propose to automate tissue-dissection tasks by introducing a convolutional neural network, called BA-Net, that predicts the location of attachment points between adjacent tissues. BA-Net identifies the attachment areas from a single partial view of the deformed surface, without any a priori knowledge of their location. The proposed method guarantees a very fast prediction time, making it well suited to intra-operative applications. Experimental validation is carried out on both simulated and real-world phantom data of soft-tissue manipulation performed with the da Vinci Research Kit (dVRK). The results demonstrate that BA-Net provides robust predictions across varying geometric configurations, material properties, attachment-point distributions, and grasping-point locations. The attachment-point estimates provided by BA-Net improve the simulation of the anatomical environment in which the system is acting, yielding a median simulation error below 5 mm in all tested conditions. BA-Net can thus further support an ARS by providing a more robust intra-operative test bench for robotic actions, in particular when replanning is needed. The method and collected dataset are available at https://gitlab.com/altairLab/banet.
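The physical cue that makes attachment points recoverable from a deformed surface is that anchored tissue moves less under manipulation than free tissue. The toy 1-D sketch below makes that cue explicit with an argmin heuristic rather than a learned network; the grasp position, attachment position, and Gaussian falloff widths are arbitrary demo values, and the actual BA-Net replaces this heuristic with a CNN operating on a partial view of the deformed surface.

```python
import numpy as np

# One row of surface points along a normalized tissue strip.
n = 50
xs = np.linspace(0.0, 1.0, n)
grasp_at, attached_at = 0.8, 0.3   # hypothetical positions for this demo

# Displacement a free (unattached) strip would show: large near the grasp,
# decaying smoothly away from it.
free_disp = np.exp(-((xs - grasp_at) ** 2) / 0.05)

# Observed displacement: the same pull, but suppressed where the strip is
# anchored to the layer underneath.
suppression = 1.0 - np.exp(-((xs - attached_at) ** 2) / 0.01)
observed_disp = free_disp * suppression

# "Prediction": the attachment sits where observed motion falls furthest
# short of the motion an unattached strip would exhibit.
deficit = observed_disp / (free_disp + 1e-9)
predicted_at = xs[np.argmin(deficit)]
```

Comparing observed motion against the motion expected for free tissue localizes the anchor; a learned model generalizes this comparison to unknown materials, geometries, and grasp locations, which is exactly the robustness the abstract reports.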