2016
DOI: 10.1088/0031-9155/61/20/7377

EVolution: an edge-based variational method for non-rigid multi-modal image registration

Abstract: Image registration is part of a large variety of medical applications including diagnosis, monitoring disease progression and/or treatment effectiveness and, more recently, therapy guidance. Such applications usually involve several imaging modalities such as ultrasound, computed tomography, positron emission tomography, x-ray or magnetic resonance imaging, either separately or combined. In the current work, we propose a non-rigid multi-modal registration method (namely EVolution: an edge-based variational met…

Cited by 50 publications (36 citation statements)
References 25 publications (27 reference statements)
“…In each dataset, the in‐plane spatial transformation between the anatomical T2‐weighted magnitude image and the first frame of the pCASL magnitude image (i.e. the reference image chosen previously) was then automatically estimated as follows: an edge‐based variational method for non-rigid multimodal registration was employed to circumvent potential geometrical distortions induced by the EPI readout. For this purpose, the data fidelity term in Equation (left part of the integral) was replaced by a multimodal similarity metric which favors the alignment of edges/gradients that are present in both images, as described in Chen et al. The obtained motion‐vector field was subsequently employed to adjust the position of the masks of each brain region.…”
Section: Methods
mentioning confidence: 99%
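The similarity criterion described in the statement above rewards the alignment of edges/gradients that are visible in both modalities. Below is a minimal Python sketch of such a gradient-alignment measure; the function name gradient_alignment, the cosine-based normalization and the eps constant are illustrative assumptions and do not reproduce the exact data-fidelity term of EVolution or of the citing work.

import numpy as np

def gradient_alignment(fixed, moving, eps=1e-6):
    """Mean absolute cosine between the gradients of two 2-D images.

    Values near 1 indicate that edges present in both images coincide.
    The absolute value makes the measure insensitive to intensity
    polarity, which is why edge-based terms suit multi-modal data.
    Illustrative sketch only, not the exact EVolution functional.
    """
    gy_f, gx_f = np.gradient(fixed.astype(float))
    gy_m, gx_m = np.gradient(moving.astype(float))
    dot = gy_f * gy_m + gx_f * gx_m
    norms = np.hypot(gy_f, gx_f) * np.hypot(gy_m, gx_m) + eps
    return float(np.mean(np.abs(dot) / norms))

Higher values are better, so an optimizer would maximize this term (or minimize its negative) jointly with a regularization term on the deformation.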
“…This is particularly true when needles are inserted: the insertion imposes a deformation on the liver, and therefore a rigid registration leads to irrelevant results. In [20] we propose to use the EVolution algorithm proposed by Denis de Senneville et al [14]. Roughly speaking, the velocity field between the gradients of the two images is regularized by a diffusion term.…”
Section: The Importance of the Image Registration Algorithm
mentioning confidence: 99%
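The statement above summarizes the regularization idea: the estimated velocity/displacement field is smoothed by a diffusion term. The toy loop below illustrates that structure with a demons-style intensity-difference force and Gaussian smoothing as a discrete stand-in for diffusion; it is a sketch under these assumptions, not the EVolution solver, and the function name register_diffusion and all parameter values are hypothetical.

import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def register_diffusion(fixed, moving, n_iter=100, step=0.5, sigma=2.0):
    """Toy 2-D non-rigid registration with diffusion-like regularization."""
    fixed = fixed.astype(float)
    moving = moving.astype(float)
    uy = np.zeros_like(fixed)   # displacement along rows
    ux = np.zeros_like(fixed)   # displacement along columns
    yy, xx = np.meshgrid(np.arange(fixed.shape[0]),
                         np.arange(fixed.shape[1]), indexing="ij")
    for _ in range(n_iter):
        # warp the moving image with the current displacement field
        warped = map_coordinates(moving, [yy + uy, xx + ux], order=1)
        gy, gx = np.gradient(warped)
        diff = fixed - warped
        # gradient-descent update driven by the intensity difference
        uy += step * diff * gy
        ux += step * diff * gx
        # diffusion regularization: smooth the displacement field
        uy = gaussian_filter(uy, sigma)
        ux = gaussian_filter(ux, sigma)
    return uy, ux

In a multi-modal setting such as the one discussed here, the intensity-difference force would be replaced by the derivative of an edge-based similarity term.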
“…It is therefore the computational reference framework. The ROIs determined on the preoperative imaging are then registered on the CBCT with needles thanks to the non-rigid registration algorithm EVolution of De Senneville et al [7], which is dedicated to multi-modal image registration. Similarly, the ROI of the treatment area observed on the postoperative MRI is registered on the CBCT, which is thus the unique reference framework.…”
Section: 3.2
mentioning confidence: 99%
“…Registration of the pretreatment ROIs on the CBCT. In order to obtain the geometrical framework of the procedure, we performed the registration of the preoperative image on the CBCT using the EVolution algorithm [7]. This non-rigid registration algorithm has been validated on clinical data and the clinical relevance of the registration has been verified by radiologists of the University Hospital J. Verdier.…”
Section: 2.2
mentioning confidence: 99%
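Once a deformation field has been estimated, the ROIs delineated on the preoperative image can be propagated onto the CBCT by resampling their masks through the field. The helper below, warp_mask, is a hypothetical illustration of that step (backward mapping with nearest-neighbour interpolation so the mask stays binary); it is not part of the EVolution code base.

import numpy as np
from scipy.ndimage import map_coordinates

def warp_mask(mask, uy, ux):
    """Resample a binary ROI mask through a displacement field (uy, ux).

    order=0 (nearest-neighbour) keeps the labels binary. Hypothetical
    sketch of ROI propagation after non-rigid registration.
    """
    yy, xx = np.meshgrid(np.arange(mask.shape[0]),
                         np.arange(mask.shape[1]), indexing="ij")
    warped = map_coordinates(mask.astype(np.float32),
                             [yy + uy, xx + ux], order=0)
    return warped > 0.5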