WE-D-9A-04: Improving Multi-Modality Image Registration Using Edge-Based Transformations


Multi-modality deformable image registration (DIR) for head & neck (HN) radiotherapy is difficult, particularly when matching computed tomography (CT) scans with magnetic resonance imaging (MRI) scans. We hypothesized that the ‘shared information’ between images of different modalities resides in some form of edge-based representation, and that novel edge-based DIR methods might therefore outperform standard DIR methods.


We propose a novel method that combines gray-scale edge-based morphology and mutual information (MI) in two stages. In the first stage, we applied a modification of a previously published mathematical-morphology method as an efficient gray-scale edge estimator with a built-in denoising step. The resulting edge images were fed into an MI-based solver (plastimatch). The method was tested on 5 HN patients with pretreatment CT and MR datasets and associated weekly follow-up MR scans. The follow-up MR scans showed significant regression of tumor and normal-structure volumes compared with the pretreatment MRs. The MR images used in this study were fast spin-echo T2-weighted (T2w) images with 1 mm isotropic resolution and a field of view matching the CT scan.
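The first-stage edge transform can be sketched as a denoising filter followed by a gray-scale morphological gradient. This is an illustrative reconstruction only: the filter choices and window sizes below are hypothetical defaults, not the authors' published settings.

```python
import numpy as np
from scipy import ndimage

def edge_transform(image: np.ndarray, denoise_size: int = 3,
                   edge_size: int = 3) -> np.ndarray:
    """Gray-scale edge estimator with denoising (illustrative sketch).

    Median-filters the image to suppress noise, then applies the
    morphological gradient (gray-scale dilation minus erosion over a
    flat structuring element), a standard gray-scale edge operator.
    The window sizes are assumptions for illustration.
    """
    denoised = ndimage.median_filter(image, size=denoise_size)
    return ndimage.morphological_gradient(
        denoised, size=(edge_size,) * image.ndim)
```

In a two-stage pipeline of this kind, both the CT and MR volumes would be passed through the edge transform, and the resulting edge images (rather than the original intensities) would be supplied to the MI-based deformable solver.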


In all cases, the novel edge-based registration method provided better registration quality than MI-based DIR on the original CT and MRI images. For example, the mismatch in the carotid arteries was reduced from 3–5 mm to within 2 mm. Across different registration regularization parameters, the edge-based method produced no distorted deformations, in contrast to the non-realistic deformations resulting from MI on the original images. Processing time was 1.3 to 2 times shorter (edge-based vs. non-edge). Overall, we observed both quality improvement and a significant reduction in computation time with the new method.


Transforming images into an appropriately designed ‘edge space’ greatly increases the speed and accuracy of DIR.