Aerial Road Segmentation in the Presence of Topological Label Noise

Corentin Henry, Friedrich Fraundorfer, Eleonora Vig

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

The availability of large-scale annotated datasets has enabled Fully-Convolutional Neural Networks to reach outstanding performance on road extraction in aerial images. However, high-quality pixel-level annotation is expensive to produce, and even manually labeled data often contains topological errors. Trading off quality for quantity, many datasets rely on already available yet noisy labels, for example from OpenStreetMap. In this paper, we explore the training of custom U-Nets built with ResNet and DenseNet backbones using noise-aware losses that are robust to label omission and registration noise. We perform an extensive evaluation of standard and noise-aware losses, including a novel Bootstrapped DICE-Coefficient loss, on two challenging road segmentation benchmarks. Our losses yield a consistent improvement in overall extraction quality and exhibit a strong capacity to cope with severe label noise. Our method generalizes well to two other fine-grained topology delineation tasks: surface crack detection for quality inspection and cell membrane extraction in electron microscopy imagery.
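The abstract's key ingredient is a bootstrapped Dice loss for training under noisy road labels. As a rough illustration only, the sketch below shows one plausible PyTorch formulation that combines soft bootstrapping (blending the possibly noisy annotation with the network's own prediction) with the standard soft Dice coefficient; the class name, the blending weight beta, and the exact blending scheme are assumptions for illustration, not the paper's definition.

```python
# Hypothetical sketch of a "bootstrapped" soft-Dice loss in PyTorch.
# This is NOT the paper's exact formulation; it follows the common
# soft-bootstrapping idea combined with the soft Dice coefficient.
import torch
import torch.nn as nn


class BootstrappedDiceLoss(nn.Module):
    def __init__(self, beta: float = 0.8, eps: float = 1e-6):
        super().__init__()
        self.beta = beta  # trust placed in the (possibly noisy) annotation
        self.eps = eps    # numerical stabilizer for empty masks

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # logits, targets: (N, 1, H, W); targets are binary road masks in {0, 1}
        probs = torch.sigmoid(logits)
        # Soft bootstrapping: blend the label with the model's own prediction,
        # so omitted or misregistered road pixels that the model confidently
        # predicts still contribute to the (soft) target.
        boot_targets = self.beta * targets + (1.0 - self.beta) * probs.detach()
        dims = (1, 2, 3)
        intersection = (probs * boot_targets).sum(dims)
        cardinality = probs.sum(dims) + boot_targets.sum(dims)
        dice = (2.0 * intersection + self.eps) / (cardinality + self.eps)
        return 1.0 - dice.mean()
```

Under this sketch, training would call something like `criterion = BootstrappedDiceLoss(beta=0.8)` and `loss = criterion(model(images), road_masks)`; the beta value is a placeholder, not a value reported in the paper.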
Original language: English
Title of host publication: ICPR 2020
Publication status: Published - 12 Jan 2021
Event: 25th International Conference on Pattern Recognition - Milano, Virtual, Italy
Duration: 10 Jan 2021 → 15 Feb 2021

Conference

Conference: 25th International Conference on Pattern Recognition
Abbreviated title: ICPR 2020
Country/Territory: Italy
City: Virtual
Period: 10/01/21 → 15/02/21

Keywords

  • label noise
  • road extraction
  • semantic segmentation
  • satellite imagery
  • aerial imagery

Fields of Expertise

  • Information, Communication & Computing
