Real-Time View Planning for Unstructured Lumigraph Modeling

Okan Erat, Markus Hoell, Karl Haubenwallner, Christian Pirchheim, Dieter Schmalstieg

Research output: Contribution to journal › Article › peer-review


We propose an algorithm for generating an unstructured lumigraph in real-time from an image stream. This problem has important applications in mixed reality, such as telepresence, interior design, or as-built documentation. Unlike conventional texture optimization in structure from motion, our method must choose views from the input stream in a strictly incremental manner, since only a small number of views can be stored or transmitted. This requires formulating an online variant of the well-known view-planning problem, which must take into account which parts of the scene have already been seen and how the lumigraph sample distribution could improve in the future. We address this highly unconstrained problem by regularizing the scene structure with a regular grid. On this grid, we define a coverage metric describing how well the lumigraph samples cover the grid in terms of spatial and angular resolution, and we greedily keep incoming views if they improve the coverage. We evaluate the performance of our algorithm quantitatively and qualitatively on a variety of synthetic and real scenes, and demonstrate visually appealing results obtained at real-time frame rates (in the range of 3 Hz-100 Hz per incoming image, depending on configuration).
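The greedy selection described in the abstract can be illustrated with a minimal sketch. All names, the 2-D grid, the azimuthal angle binning, and the `min_gain` threshold are assumptions for illustration, not the paper's actual formulation: each grid cell tracks which discretized view directions have already been sampled, and an incoming view is kept only if it covers enough new (cell, angle-bin) pairs.

```python
# Hypothetical sketch of greedy, coverage-based keyframe selection.
# The grid, angle binning, and threshold are illustrative assumptions.
import math

class CoverageGrid:
    def __init__(self, num_angle_bins=8):
        self.num_angle_bins = num_angle_bins
        # Maps grid-cell index -> set of covered angular bins.
        self.covered = {}

    def _angle_bin(self, cell, cam_pos):
        # Discretize the direction from the camera to the cell center
        # into one of num_angle_bins azimuthal bins.
        dx, dy = cell[0] - cam_pos[0], cell[1] - cam_pos[1]
        angle = math.atan2(dy, dx) % (2 * math.pi)
        return int(angle / (2 * math.pi) * self.num_angle_bins)

    def coverage_gain(self, visible_cells, cam_pos):
        # Count (cell, angle-bin) pairs this view would newly cover.
        gain = 0
        for cell in visible_cells:
            b = self._angle_bin(cell, cam_pos)
            if b not in self.covered.get(cell, set()):
                gain += 1
        return gain

    def add_view(self, visible_cells, cam_pos):
        for cell in visible_cells:
            self.covered.setdefault(cell, set()).add(
                self._angle_bin(cell, cam_pos))

def select_keyframes(stream, min_gain=3):
    # Greedily keep an incoming view only if it improves coverage enough;
    # rejected views are discarded, so storage stays incremental.
    grid, kept = CoverageGrid(), []
    for view_id, cam_pos, visible_cells in stream:
        if grid.coverage_gain(visible_cells, cam_pos) >= min_gain:
            grid.add_view(visible_cells, cam_pos)
            kept.append(view_id)
    return kept
```

In this sketch a second view taken from the same position as an earlier one adds no new angular samples and is rejected, while the same cells seen from a new direction improve coverage and are kept.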
Original language: English
Pages (from-to): 3063-3072
Number of pages: 10
Journal: IEEE Transactions on Visualization and Computer Graphics
Issue number: 11
Publication status: Published - 2019


Keywords

  • Lumigraph
  • multi-view
  • real-time
  • view planning
  • virtual reality
  • rendering
  • keyframe selection

Fields of Expertise

  • Information, Communication & Computing

