Interactive-cut: Real-time feedback segmentation for translational research

Jan Egger*, Tobias Lüddemann, Robert Schwarzenberg, Bernd Freisleben, Christopher Nimsky

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this contribution, a scale-invariant image segmentation algorithm is introduced that "wraps" the algorithm's parameters for the user through its interactive behavior, avoiding the definition of "arbitrary" numbers that the user cannot easily interpret. To this end, we designed a graph-based segmentation method that requires only a single user-defined seed point inside the target structure and is thus particularly suitable for immediate processing and interactive, real-time adjustments by the user. In addition, the color or gray-value information needed by the approach is extracted automatically around the user-defined seed point. Furthermore, the graph is constructed in such a way that a polynomial-time min-cut computation provides the segmentation result within a second on an up-to-date computer. The algorithm presented here has been evaluated with fixed seed points on 2D and 3D medical image data, such as brain tumors, cerebral aneurysms and vertebral bodies. Direct comparison of the automatic segmentation results with more time-consuming, manual slice-by-slice segmentations performed by trained physicians suggests a strong medical relevance of this interactive approach.
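
As a rough illustration of the idea described in the abstract, the following Python sketch shows how a single seed point can be turned into an s-t min-cut problem on a pixel graph. This is a minimal, generic example and not the authors' specific graph construction: the intensity model, the edge weights, the 4-neighborhood, and the use of networkx's minimum_cut are assumptions made purely for illustration.

# Minimal, generic sketch of seed-based graph-cut segmentation on a 2D slice.
# NOTE: this is NOT the graph construction of the paper; it only illustrates
# the general principle of deriving an s-t min-cut problem from a single seed.
import numpy as np
import networkx as nx

def segment_with_seed(image, seed, sigma=10.0, smoothness=5.0):
    """Return a boolean mask for a 2D gray-value image and a seed (row, col)."""
    h, w = image.shape
    seed_value = float(image[seed])          # gray value sampled at the seed
    G = nx.DiGraph()
    src, snk = "SRC", "SNK"
    LARGE = 1e9                              # stands in for an infinite capacity

    for r in range(h):
        for c in range(w):
            node = (r, c)
            diff = abs(float(image[r, c]) - seed_value)
            similarity = np.exp(-diff / sigma)
            # Terminal edges: pixels similar to the seed get a strong link to
            # the source (object), dissimilar pixels to the sink (background).
            G.add_edge(src, node, capacity=similarity)
            G.add_edge(node, snk, capacity=1.0 - similarity)
            # Smoothness edges between 4-neighbors discourage ragged cuts.
            for dr, dc in ((0, 1), (1, 0)):
                rr, cc = r + dr, c + dc
                if rr < h and cc < w:
                    wgt = smoothness * np.exp(
                        -abs(float(image[r, c]) - float(image[rr, cc])) / sigma)
                    G.add_edge(node, (rr, cc), capacity=wgt)
                    G.add_edge((rr, cc), node, capacity=wgt)

    # Hard constraint: the seed pixel itself must end up on the object side.
    G.add_edge(src, seed, capacity=LARGE)

    _, (object_side, _) = nx.minimum_cut(G, src, snk)
    mask = np.zeros((h, w), dtype=bool)
    for node in object_side:
        if node not in (src, snk):
            mask[node] = True
    return mask

For example, calling segment_with_seed(slice_2d, (120, 96)) on a single 2D slice would return a boolean mask of the pixels whose gray values are consistent with the seed, separated from the background by the minimum cut; the hypothetical parameters sigma and smoothness control the intensity tolerance and the boundary regularity.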
Original language: English
Pages (from-to): 285-295
Journal: Computerized Medical Imaging and Graphics
Volume: 38
Issue number: 4
DOIs
Publication status: Published - 2014

Fields of Expertise

  • Other

Treatment code (detailed classification)

  • Application
  • Experimental
  • Popular Scientific

