UNOC: Understanding Occlusion for Embodied Presence in Virtual Reality

Mathias Parger, Chengcheng Tang, Yuanlu Xu, Christopher David Twigg, Lingling Tao, Yijing Li, Robert Wang, Markus Steinberger

Publication: Journal article, peer-reviewed

Abstract

Tracking body and hand motions in 3D space is essential for social and self-presence in augmented and virtual environments. Unlike the popular 3D pose estimation setting, the problem is often formulated as egocentric tracking based on embodied perception (e.g., egocentric cameras, handheld sensors). In this paper, we propose a new data-driven framework for egocentric body tracking, targeting the challenge of omnipresent occlusions in optimization-based methods (e.g., inverse kinematics solvers). We first collect a large-scale motion capture dataset with both body and finger motions using optical markers and inertial sensors. This dataset focuses on social scenarios and captures ground truth poses under self-occlusions and body-hand interactions. We then simulate the occlusion patterns of head-mounted camera views on the captured ground truth using a ray casting algorithm and learn a deep neural network to infer the occluded body parts. Our experiments show that the proposed method generates high-fidelity embodied poses when applied to real-time egocentric body tracking, finger motion synthesis, and 3-point inverse kinematics.
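The occlusion simulation described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes the body is approximated by a set of bounding spheres and labels a joint as occluded when the ray from the head-mounted camera to that joint intersects a sphere before reaching it.

```python
# Illustrative sketch (assumption, not the paper's code): simulate
# head-mounted-camera occlusion by ray casting against body geometry
# approximated as spheres.
import numpy as np

def ray_sphere_hit(origin, direction, center, radius, max_t):
    """True if origin + t*direction (0 < t < max_t) intersects the
    sphere (center, radius). `direction` must be unit length."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant (a = 1)
    if disc < 0.0:
        return False
    sqrt_d = np.sqrt(disc)
    for t in ((-b - sqrt_d) / 2.0, (-b + sqrt_d) / 2.0):
        # Small epsilon avoids counting the joint's own surface as a hit.
        if 1e-6 < t < max_t - 1e-6:
            return True
    return False

def occluded_joints(camera, joints, spheres):
    """joints: (N, 3) joint positions; spheres: list of (center, radius)
    pairs approximating the body. Returns a boolean occlusion mask."""
    mask = np.zeros(len(joints), dtype=bool)
    for i, joint in enumerate(joints):
        to_joint = joint - camera
        dist = np.linalg.norm(to_joint)
        direction = to_joint / dist
        mask[i] = any(
            ray_sphere_hit(camera, direction, c, r, dist)
            for c, r in spheres
        )
    return mask
```

In such a setup, the resulting per-joint occlusion mask would serve as training supervision: the network sees only the "visible" joints and is trained to reconstruct the ground-truth pose of the masked ones.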

Original language: English
Journal: IEEE Transactions on Visualization and Computer Graphics
Early online date: 2021
DOIs
Publication status: E-pub ahead of print - 2021

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design
