SIFT and Shape Context for Feature-Based Nonlinear Registration of Thoracic CT Images

Martin Urschler, Joachim Bauer, Hendrik Ditt, Horst Bischof

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Nonlinear image registration is a prerequisite for various medical image analysis applications. Many data acquisition protocols suffer from problems due to breathing motion, which has to be taken into account for further analysis. Intensity-based nonlinear registration is often used to align differing images; however, it requires a large computational effort, is sensitive to intensity variations, and has problems matching small structures. In this work, a feature-based image registration method is proposed that combines runtime efficiency with good registration accuracy by means of a fully automatic feature matching and registration approach. The algorithm stages are 3D corner detection, calculation of local (SIFT) and global (Shape Context) 3D descriptors, robust feature matching, and calculation of a dense displacement field. An evaluation of the algorithm on seven synthetic and four clinical data sets is presented. The quantitative and qualitative evaluations show lower runtime and superior results compared to the Demons algorithm.
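
The abstract outlines a four-stage pipeline ending in a dense displacement field computed from sparse feature correspondences. As a rough illustration of that final stage only, the following is a minimal sketch using SciPy's thin-plate-spline interpolation on synthetic matched points; the paper does not prescribe this exact interpolation scheme, and the point data here is invented for demonstration.

```python
# Sketch: sparse feature matches -> dense 3D displacement field.
# Assumes thin-plate-spline scattered-data interpolation (SciPy); the
# matched points below are synthetic stand-ins for real SIFT/Shape
# Context correspondences.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Hypothetical matched feature locations (N x 3, in voxel coordinates)
# in the moving image and their counterparts in the fixed image.
moving_pts = rng.uniform(0, 63, size=(50, 3))
fixed_pts = moving_pts + rng.normal(0, 2.0, size=(50, 3))  # stand-in for breathing motion

# Fit a smooth interpolant of the sparse displacement vectors.
tps = RBFInterpolator(moving_pts, fixed_pts - moving_pts,
                      kernel="thin_plate_spline", smoothing=1e-3)

# Evaluate it at every voxel of a (64, 64, 64) volume to obtain a dense field.
shape = (64, 64, 64)
grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"),
                axis=-1).reshape(-1, 3)
dense_field = tps(grid).reshape(*shape, 3)  # one (dz, dy, dx) vector per voxel
print(dense_field.shape)  # (64, 64, 64, 3)
```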
Original language: English
Title of host publication: Computer Vision Approaches to Medical Image Analysis
Subtitle of host publication: Second International ECCV Workshop, CVAMIA 2006, Graz, Austria, May 12, 2006, Revised Papers
Editors: Reinhard R. Beichel, Milan Sonka
Place of Publication: Berlin Heidelberg
Publisher: Springer
Pages: 73-84
Volume: 4241
ISBN (Electronic): 978-3-540-46258-3
ISBN (Print): 978-3-540-46257-6
DOIs
Publication status: Published - 2006

Publication series

Name: Lecture Notes in Computer Science

Fields of Expertise

  • Information, Communication & Computing
