Interactive-cut: Real-time feedback segmentation for translational research

Jan Egger*, Tobias Lüddemann, Robert Schwarzenberg, Bernd Freisleben, Christopher Nimsky

*Corresponding author for this work

Publication: Contribution to a journal › Article › Peer-reviewed

Abstract

In this contribution, a scale-invariant image segmentation algorithm is introduced that “wraps” its parameters for the user through its interactive behavior, avoiding the definition of “arbitrary” numbers that the user cannot really understand. To this end, we designed a specific graph-based segmentation method that requires only a single seed point inside the target structure from the user and is thus particularly suitable for immediate processing and interactive, real-time adjustments. In addition, the color or gray-value information needed by the approach can be extracted automatically around the user-defined seed point. Furthermore, the graph is constructed in such a way that a polynomial-time min-cut computation can provide the segmentation result within one second on an up-to-date computer. The algorithm presented here has been evaluated with fixed seed points on 2D and 3D medical image data, such as brain tumors, cerebral aneurysms and vertebral bodies. A direct comparison of the obtained automatic segmentation results with more time-consuming, manual slice-by-slice segmentations performed by trained physicians suggests a strong medical relevance of this interactive approach.
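The pipeline sketched in the abstract (a single user seed, gray-value information sampled at the seed, a pixel graph, and a polynomial-time min-cut) can be illustrated with a toy example. The sketch below is not the authors' graph construction: the t-link/n-link weighting (`max(0, 50 - d)` similarity, the smoothness factor `lam`, a 4-neighborhood) is an illustrative assumption, and the max-flow solver is a plain Edmonds-Karp rather than the paper's specific method.

```python
from collections import deque

def add_edge(cap, u, v, c):
    # forward capacity c, plus a zero-capacity reverse edge for the residual graph
    cap.setdefault(u, {})
    cap.setdefault(v, {})
    cap[u][v] = cap[u].get(v, 0) + c
    cap[v].setdefault(u, 0)

def bfs_augment(cap, flow, s, t):
    # one Edmonds-Karp step: shortest augmenting path in the residual graph
    parent = {s: None}
    q = deque([s])
    while q and t not in parent:
        u = q.popleft()
        for v, c in cap[u].items():
            if v not in parent and c - flow[u].get(v, 0) > 0:
                parent[v] = u
                q.append(v)
    if t not in parent:
        return 0
    path, v = [], t
    while parent[v] is not None:
        path.append((parent[v], v))
        v = parent[v]
    b = min(cap[u][v] - flow[u].get(v, 0) for u, v in path)
    for u, v in path:
        flow[u][v] = flow[u].get(v, 0) + b
        flow[v][u] = flow[v].get(u, 0) - b
    return b

def segment(image, seed, lam=2):
    # one node per pixel, plus source 's' (object) and sink 't' (background)
    h, w = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]  # gray value sampled at the seed
    cap = {}
    for y in range(h):
        for x in range(w):
            d = abs(image[y][x] - seed_val)
            # t-links (hypothetical weighting): similar to seed -> source, else -> sink
            add_edge(cap, 's', (y, x), max(0, 50 - d))
            add_edge(cap, (y, x), 't', d)
            # n-links: cutting between similar neighbors is expensive
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w:
                    wgt = lam * max(0, 50 - abs(image[y][x] - image[ny][nx]))
                    add_edge(cap, (y, x), (ny, nx), wgt)
                    add_edge(cap, (ny, nx), (y, x), wgt)
    add_edge(cap, 's', seed, 10**9)  # hard constraint: the seed is foreground
    flow = {u: {} for u in cap}
    while bfs_augment(cap, flow, 's', 't'):
        pass
    # min cut: pixels still reachable from the source form the segmented object
    reach, q = {'s'}, deque(['s'])
    while q:
        u = q.popleft()
        for v, c in cap[u].items():
            if v not in reach and c - flow[u].get(v, 0) > 0:
                reach.add(v)
                q.append(v)
    return [[1 if (y, x) in reach else 0 for x in range(w)] for y in range(h)]
```

On a tiny synthetic image with a bright blob, e.g. `segment([[10,10,10,10],[10,90,90,10],[10,90,90,10],[10,10,10,10]], (1, 1))`, the returned binary mask marks exactly the four blob pixels. Because only the seed coordinate comes from the user, re-running `segment` with a moved seed is cheap, which is the property that makes the real-time interactive adjustment described in the abstract possible.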
Original language: English
Pages (from - to): 285-295
Journal: Computerized Medical Imaging and Graphics
Volume: 38
Issue number: 4
DOIs
Publication status: Published - 2014

Fields of Expertise

  • Other

Treatment code (detailed classification)

  • Application
  • Experimental
  • Popular Scientific
