Abstract
Efficient and reliable navigation in off-road environments is a significant challenge in robotics, especially when the varying capabilities of robots across different terrains must be taken into account. To plan traversable routes through an environment, a robot system's traversability is typically estimated. This paper presents a new approach that uses Deep Multimodal Variational Autoencoders (DMVAEs) to estimate the traversability of different robots in complex off-road terrain. The DMVAE captures essential environmental information together with robot properties, effectively modeling the factors that influence robotic traversability. The key contribution of this research is a two-stage traversability estimation framework for diverse off-road conditions that integrates robot properties alongside environmental information, predicting traversability for multiple robots with a single model. We validate our method through real-world experiments involving four ground robots navigating an alpine environment. Comparative evaluations against state-of-the-art traversability estimation methods demonstrate the superior accuracy and robustness of our approach. Additionally, we investigate transferring trained models to new robots, improving their traversability estimates and extending the applicability of our framework.
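The record does not include implementation details, but the core idea of fusing an environment modality and a robot-property modality into one shared latent can be sketched with a product-of-experts Gaussian fusion, a common choice in multimodal VAEs. Everything below (layer sizes, feature names, the PoE fusion itself, and the sigmoid traversability head) is an illustrative assumption, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, w_mu, w_logvar):
    """Toy linear 'encoder': maps one modality to a latent mean and log-variance."""
    return x @ w_mu, x @ w_logvar

def product_of_experts(mus, logvars):
    """Fuse per-modality Gaussians into a single joint Gaussian.

    The joint precision is the sum of the experts' precisions, and the joint
    mean is the precision-weighted average of the experts' means.
    """
    precisions = np.exp(-np.stack(logvars))        # 1 / var_i per expert
    joint_var = 1.0 / precisions.sum(axis=0)
    joint_mu = joint_var * (np.stack(mus) * precisions).sum(axis=0)
    return joint_mu, np.log(joint_var)

latent_dim = 4
env_feat = rng.normal(size=16)     # hypothetical terrain features (e.g. from a map patch)
robot_props = rng.normal(size=6)   # hypothetical robot properties (mass, clearance, ...)

# Random toy weights stand in for trained encoder parameters.
mu_e, lv_e = encoder(env_feat,
                     rng.normal(size=(16, latent_dim)),
                     rng.normal(size=(16, latent_dim)) * 0.1)
mu_r, lv_r = encoder(robot_props,
                     rng.normal(size=(6, latent_dim)),
                     rng.normal(size=(6, latent_dim)) * 0.1)

# Fuse both modalities, then sample via the reparameterization trick.
mu_z, lv_z = product_of_experts([mu_e, mu_r], [lv_e, lv_r])
z = mu_z + np.exp(0.5 * lv_z) * rng.normal(size=latent_dim)

# Toy 'decoder' head mapping the fused latent to a traversability score in [0, 1].
traversability = 1.0 / (1.0 + np.exp(-(z @ rng.normal(size=latent_dim))))
print(round(float(traversability), 3))
```

One property that makes PoE fusion attractive here: adding a modality can only sharpen the joint posterior, since the fused variance is never larger than any single expert's variance.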
Original language | English |
---|---|
Title of host publication | 2024 IEEE International Conference on Robotics and Automation (ICRA) |
Publisher | IEEE |
Number of pages | 8 |
Publication status | Published - 2024 |
Event | 2024 IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan, 13 May 2024 → 17 May 2024 |
Conference
Conference | 2024 IEEE International Conference on Robotics and Automation |
---|---|
Country/Territory | Japan |
City | Yokohama |
Period | 13/05/24 → 17/05/24 |