Instant Segmentation and Fitting of Excavations in Subsurface Utility Engineering

Research output: Contribution to journal › Article › peer-review

Abstract

Using augmented reality for subsurface utility engineering (SUE) has benefited from recent advances in sensing hardware, enabling the first practical and commercial applications. However, this progress has uncovered a latent problem: the insufficient quality of existing SUE data in terms of completeness and accuracy. In this work, we present a novel approach that automates the alignment of existing SUE databases with measurements taken during excavation works, with the potential to correct the deviation from as-planned to as-built documentation, which remains a major challenge for on-site workers using traditional workflows. Our segmentation algorithm identifies utility infrastructure in a live capture of an excavation on site. Our fitting approach correlates the inferred position and orientation with the existing digital plan and registers the as-planned model into the as-built state. Our approach is the first to circumvent tedious postprocessing, as it corrects the data online and on site. In our experiments, we show the results of our proposed method on both synthetic data and a set of real excavations.
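The abstract describes a fitting step that registers the as-planned utility model into the as-built state measured on site. The sketch below is not the authors' algorithm; it only illustrates one standard way such a rigid registration can be computed (the Kabsch/Procrustes alignment), under the simplifying assumption that point correspondences between the plan geometry and the on-site capture are already known. All function and variable names here are hypothetical.

```python
# Hypothetical sketch of an as-planned -> as-built rigid registration step.
# NOT the method from the paper; a plain Kabsch alignment assuming known
# correspondences between planned and measured 3D points.
import numpy as np

def fit_rigid_transform(planned: np.ndarray, measured: np.ndarray):
    """Return rotation R and translation t mapping planned -> measured.

    planned, measured: (N, 3) arrays of corresponding 3D points.
    """
    p_mean = planned.mean(axis=0)
    q_mean = measured.mean(axis=0)
    P = planned - p_mean
    Q = measured - q_mean

    # Cross-covariance and its SVD yield the optimal rotation (Kabsch).
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against an improper (reflective) solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = q_mean - R @ p_mean
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    plan = rng.uniform(-1.0, 1.0, size=(50, 3))       # synthetic as-planned points
    angle = np.deg2rad(12.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([0.3, -0.2, 0.05])               # synthetic as-built offset
    built = plan @ R_true.T + t_true

    R, t = fit_rigid_transform(plan, built)
    print(np.allclose(R, R_true), np.allclose(t, t_true))
```

In practice a method like the one in the paper would also have to establish the correspondences themselves (e.g. from the segmented excavation capture) and handle noise and partial visibility, which this toy example deliberately ignores.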

Original language: English
Pages (from-to): 1-11
Number of pages: 11
Journal: IEEE Transactions on Visualization and Computer Graphics
DOIs
Publication status: Accepted/In press - 2024

Keywords

  • 3D Models
  • Augmented Reality
  • Documentation
  • Excavation
  • Fitting
  • Geometric Constraints
  • Infrastructure
  • Localization
  • Point cloud compression
  • Segmentation
  • Solid modeling
  • Task analysis
  • Three-dimensional displays

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design
