Magnetic resonance (MR) guided high intensity focused ultrasound and external beam radiotherapy interventions, which we shall refer to as beam therapies/interventions, are promising techniques for the non-invasive ablation of tumours in abdominal organs. However, therapeutic energy delivery in these areas becomes challenging due to the continuous displacement of the organs with respiration. Previous studies have addressed this problem by coupling high-framerate MR imaging with a tracking technique based on the algorithm proposed by Horn and Schunck (H and S), which was chosen for its fast convergence rate and highly parallelisable numerical scheme. Such characteristics were shown to be indispensable for the real-time guidance of beam therapies. In its original form, however, the algorithm is sensitive to local grey-level intensity variations not attributed to motion, such as those that occur, for example, in the proximity of pulsating arteries. In this study, an improved motion estimation strategy which reduces the impact of such effects is proposed. Displacements are estimated through the minimisation of a variation of the H and S functional in which the quadratic data fidelity term is replaced with a term based on the linear L1-norm, resulting in what we have called an L2-L1 functional. The proposed method was tested in the livers and kidneys of two healthy volunteers under free-breathing conditions, on a data set comprising 3000 images equally divided between the volunteers. The results show that, compared to existing approaches, our method demonstrates a greater robustness to local grey-level intensity variations introduced by arterial pulsations. Additionally, the computational time required by our implementation makes it compatible with the work-flow of real-time MR-guided beam interventions. To the best of our knowledge, this study was the first to analyse the behaviour of an L1-based optical flow functional in an applicative context: real-time MR-guidance of beam therapies in moving organs.
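As an illustrative sketch only (the notation and exact weighting in the paper may differ), such an L2-L1 functional can be written with an L1 data-fidelity term in place of the quadratic one, while the original quadratic regularisation of Horn and Schunck is retained:

    E(\vec{u}) = \int_{\Omega} \big| \nabla I \cdot \vec{u} + \partial_t I \big| \, d\Omega
               + \alpha \int_{\Omega} \big( \|\nabla u_1\|_2^2 + \|\nabla u_2\|_2^2 \big) \, d\Omega

Here I denotes the image intensity, \vec{u} = (u_1, u_2) the estimated displacement field and \alpha the regularisation weight; the original H and S formulation squares the first integrand instead, which is what makes it sensitive to intensity variations not caused by motion.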
Image registration is part of a large variety of medical applications, including diagnosis, monitoring of disease progression and/or treatment effectiveness and, more recently, therapy guidance. Such applications usually involve several imaging modalities, such as ultrasound, computed tomography, positron emission tomography, x-ray or magnetic resonance imaging, either separately or combined. In the current work, we propose a non-rigid multi-modal registration method (namely EVolution: an edge-based variational method for non-rigid multi-modal image registration) that aims at maximizing edge alignment between the images being registered. The proposed algorithm requires only contrasts between physiological tissues, preferably present in both image modalities, and assumes deformable/elastic tissues. Given both assumptions, the approach is shown to be well suited for non-rigid co-registration across different image types/contrasts (T1/T2) as well as different modalities (CT/MRI). This is achieved using a variational scheme that provides a fast algorithm with a low number of control parameters. Results obtained on an annotated CT data set were comparable to the ones provided by state-of-the-art multi-modal image registration algorithms, for all tested experimental conditions (image pre-filtering, image intensity variation, noise perturbation). Moreover, we demonstrate that, compared to existing approaches, our method possesses increased robustness to transient structures (i.e. structures that are only present in some of the images).
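As a minimal, hedged illustration of the idea of edge alignment (this is a generic sketch, not the exact EVolution similarity criterion; the function and parameter names are assumptions):

    # Illustrative edge-alignment score between a fixed and a (warped) moving
    # image: close to 1 where gradients are parallel or anti-parallel, close to
    # 0 where orthogonal. Generic sketch, not the EVolution data term itself.
    import numpy as np

    def edge_alignment_score(fixed, moving, eps=1e-6):
        gfy, gfx = np.gradient(fixed.astype(float))
        gmy, gmx = np.gradient(moving.astype(float))
        dot = gfx * gmx + gfy * gmy
        mag = np.hypot(gfx, gfy) * np.hypot(gmx, gmy)
        alignment = np.abs(dot) / (mag + eps)   # contrast-invariant: only edge
                                                # orientation matters
        return float(np.sum(alignment * mag) / (np.sum(mag) + eps))

Because such a criterion depends only on the presence and orientation of corresponding edges, and not on absolute grey levels, it lends itself to registration across contrasts and modalities, as described above.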
This study proposes a motion correction strategy for displacements resulting from slowly varying physiological motion that might occur during an MR-guided HIFU intervention. The authors have shown that such drifts can lead to a misalignment between interventional planning, energy delivery, and therapeutic validation. The presented volunteer study and in vivo experiment demonstrate both the relevance of the problem for HIFU therapies and the compatibility of the proposed motion compensation framework with the workflow of a HIFU intervention under clinical conditions.
Medical imaging is currently employed in the diagnosis, planning, delivery and response monitoring of cancer treatments. Due to physiological motion and/or treatment response, the shape and location of the pathology and organs-at-risk may change over time. Establishing their location within the acquired images is therefore paramount for an accurate treatment delivery and monitoring. A feasible solution for tracking anatomical changes during an image-guided cancer treatment is provided by image registration algorithms. Such methods are, however, often built upon elements originating from the computer vision/graphics domain. Since the original design of such elements did not take into consideration the material properties of particular biological tissues, the anatomical plausibility of the estimated deformations may not be guaranteed. In the current work we adapt two existing variational registration algorithms, namely Horn-Schunck and EVolution, to online soft tissue tracking. This is achieved by enforcing an incompressibility constraint on the estimated deformations during the registration process. The existing and the modified registration methods were comparatively tested against several quality assurance criteria on abdominal in vivo MR and CT data. These criteria included: the Dice similarity coefficient (DSC), the Jaccard index, the target registration error (TRE) and three additional criteria evaluating the anatomical plausibility of the estimated deformations. Results demonstrated that both the original and the modified registration methods have similar registration capabilities in high-contrast areas, with DSC and Jaccard index values predominantly in the 0.8-0.9 range and an average TRE of 1.6-2.0 mm. In contrast-devoid regions of the liver and kidneys, however, the three additional quality assurance criteria have indicated a considerable improvement of the anatomical plausibility of the deformations estimated by the incompressibility-constrained methods. Moreover, the proposed registration models maintain the potential of the original methods for online image-based guidance of cancer treatments.
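As a purely illustrative sketch of how the anatomical plausibility of an estimated deformation can be assessed in the sense of incompressibility: values of the Jacobian determinant close to 1 indicate locally volume-preserving deformations. This is a generic quality-assurance computation, not the constraint as implemented inside the modified Horn-Schunck/EVolution methods.

    # Jacobian determinant of the mapping x -> x + u(x) for a 2D displacement
    # field; det(J) ~= 1 everywhere means the deformation is locally
    # incompressible. Generic QA sketch, not the papers' internal formulation.
    import numpy as np

    def jacobian_determinant(u, v):
        """u, v: x- and y-components of the displacement field (in voxels)."""
        du_dy, du_dx = np.gradient(u)
        dv_dy, dv_dx = np.gradient(v)
        return (1.0 + du_dx) * (1.0 + dv_dy) - du_dy * dv_dx

    # A rigid translation is trivially volume preserving: det(J) == 1
    u = np.full((64, 64), 2.5)
    v = np.full((64, 64), -1.0)
    assert np.allclose(jacobian_determinant(u, v), 1.0)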
Image-guided external beam radiotherapy (EBRT) allows radiation dose deposition with a high degree of accuracy and precision. Guidance is usually achieved by estimating the displacements, via image registration, between cone beam computed tomography (CBCT) and computed tomography (CT) images acquired at different stages of the therapy. The resulting displacements are then used to reposition the patient such that the location of the tumor at the time of treatment matches its position during planning. Moreover, ongoing research aims to use CBCT-CT image registration for online plan adaptation. However, CBCT images are usually acquired using a small number of x-ray projections and/or low beam intensities. This often leads to images with low contrast, low signal-to-noise ratio and artifacts, which ends up hampering the image registration process. Previous studies addressed this by integrating additional image processing steps into the registration procedure. However, these steps are usually designed for particular image acquisition schemes, therefore limiting their use to a case-by-case basis. In the current study we address CT to CBCT and CBCT to CBCT registration by means of the recently proposed EVolution registration algorithm. Contrary to previous approaches, EVolution does not require the integration of additional image processing steps into the registration scheme. Moreover, the algorithm requires a low number of input parameters, is easily parallelizable and provides an elastic deformation on a point-by-point basis. Results have shown that, relative to a pure CT-based registration, the intrinsic artifacts present in typical CBCT images have only a sub-millimeter impact on the accuracy and precision of the estimated deformation. In addition, the algorithm has low computational requirements, which are compatible with online image-based guidance of EBRT treatments.
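As a hedged sketch of how a voxel-wise displacement estimate could be reduced to a single set-up correction for repositioning (the mask name, median reduction and array layout below are assumptions made for illustration, not the repositioning procedure actually used clinically):

    # Reduce a dense displacement field to one rigid couch-shift vector over
    # the target region. Array layout, mask and median reduction are
    # assumptions for this sketch.
    import numpy as np

    def couch_shift_mm(displacement_vox, target_mask, voxel_size_mm):
        """displacement_vox: (3, Z, Y, X) displacements in voxels.
        target_mask: boolean (Z, Y, X) mask of the tumor/target volume.
        voxel_size_mm: (dz, dy, dx) voxel dimensions in mm."""
        return np.array([np.median(displacement_vox[c][target_mask]) * voxel_size_mm[c]
                         for c in range(3)])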
Hyperthermia treatment planning (HTP) is valuable to optimize tumor heating during thermal therapy delivery. Yet, clinical hyperthermia treatment plans lack quantitative accuracy due to uncertainties in tissue properties and modeling, and report tumor absorbed power and temperature distributions which cannot be linked directly to treatment outcome. Over the last decade, considerable progress has been made to address these inaccuracies and therefore improve the reliability of hyperthermia treatment planning. Patient-specific electrical tissue conductivity derived from MR measurements has been introduced to accurately model the power deposition in the patient. Thermodynamic fluid modeling has been developed to account for the convective heat transport in fluids such as urine in the bladder. Moreover, discrete vasculature trees have been included in thermal models to account for the impact of thermally significant large blood vessels. Computationally efficient optimization strategies based on specific absorption rate (SAR) and temperature distributions have been established to calculate the phase-amplitude settings that provide the best tumor thermal dose while avoiding hot spots in normal tissue. Finally, biological modeling has been developed to quantify the hyperthermic radiosensitization effect in terms of equivalent radiation dose of the combined radiotherapy and hyperthermia treatment. In this paper, we review the present status of these developments and illustrate the most relevant advanced elements within a single treatment planning example of a cervical cancer patient. The resulting advanced HTP workflow paves the way for a clinically feasible and more reliable patient-specific hyperthermia treatment planning.
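For background on the thermal dose referred to above, the standard metric is cumulative equivalent minutes at 43 °C (CEM43, the Sapareto-Dewey formulation); a minimal sketch follows, shown purely for context and not taken from the reviewed HTP workflow:

    # Cumulative equivalent minutes at 43 degC (CEM43) for one spatial
    # location, using the standard Sapareto-Dewey breakpoint model.
    # Background sketch only; not part of the reviewed planning software.
    import numpy as np

    def cem43(temperatures_degC, dt_minutes):
        T = np.asarray(temperatures_degC, dtype=float)
        R = np.where(T >= 43.0, 0.5, 0.25)          # breakpoint at 43 degC
        return float(np.sum(dt_minutes * R ** (43.0 - T)))

    # 30 one-minute samples at a constant 41 degC: 30 * 0.25**2 = 1.875 CEM43
    print(cem43(np.full(30, 41.0), dt_minutes=1.0))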
Background: During lengthy magnetic resonance-guided high intensity focused ultrasound (MRg-HIFU) thermal ablations in abdominal organs, the therapeutic work-flow is frequently hampered by various types of physiological motion occurring at different time-scales. If left unaddressed, this can lead to an incomplete therapy and/or to tissue damage of organs-at-risk. While previous studies focus on correction schemes for displacements occurring at a particular time-scale within the work-flow of an MRg-HIFU therapy, in the current work we propose a motion correction strategy encompassing the entire work-flow.

Methods: The proposed motion compensation framework consists of several linked components, each adapted to motion occurring at a particular time-scale. While respiration was addressed through a fast correction scheme, long-term organ drifts were compensated using a strategy operating on time-scales of several minutes. The framework relies on a periodic examination of the treated area via MR scans, which are then registered to a reference scan acquired at the beginning of the therapy. The resulting displacements were used both to re-optimize the interventional plan on-the-fly and to ensure the spatial fidelity between the different steps of the therapeutic work-flow. The approach was validated in three complementary studies: an experiment conducted on a phantom undergoing a known motion pattern, a study performed on the abdomen of 10 healthy volunteers, and 3 in-vivo MRg-HIFU ablations on porcine liver.

Results: Results have shown that, during lengthy MRg-HIFU thermal therapies, the human liver and kidney can manifest displacements that exceed acceptable therapeutic margins. Also, it was demonstrated that the proposed framework is capable of providing motion estimates with sub-voxel precision and accuracy. Finally, the 3 successful animal studies demonstrate the compatibility of the proposed approach with the work-flow of an MRg-HIFU intervention under clinical conditions.

Conclusions: In the current study we proposed an image-based motion compensation framework dedicated to MRg-HIFU thermal ablations in the abdomen, providing the possibility to re-optimize the therapy plan on-the-fly with the patient on the interventional table. Moreover, we have demonstrated that, even under clinical conditions, the proposed approach is fully capable of continuously ensuring the spatial fidelity between the different phases of the therapeutic work-flow.
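As a schematic sketch of the periodic drift-monitoring component described above (the callable names, the 5 mm margin and the two-minute period are placeholders, not the framework's actual interface or thresholds):

    # Schematic long-term drift monitoring: periodically register the latest MR
    # scan to the reference acquired at the start of therapy and trigger an
    # on-the-fly plan re-optimisation when the drift exceeds the margin.
    # Callable names, margin and period are assumptions for this sketch.
    import time
    import numpy as np

    ACCEPTABLE_MARGIN_MM = 5.0    # assumed therapeutic margin
    MONITORING_PERIOD_S = 120.0   # assumed drift-check interval

    def monitor_drift(reference, acquire_scan, register, reoptimize_plan,
                      n_checks=10):
        for _ in range(n_checks):
            current = acquire_scan()                  # periodic MR examination
            field_mm = register(reference, current)   # (3, Z, Y, X) displacements, mm
            drift = float(np.max(np.linalg.norm(field_mm, axis=0)))
            if drift > ACCEPTABLE_MARGIN_MM:
                reoptimize_plan(field_mm)             # on-the-fly plan update
            time.sleep(MONITORING_PERIOD_S)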