LiDAR-Based Scene Understanding for Navigation in Unstructured Environments

Publication: Chapter in book/report/conference proceedings › Conference paper › Peer-reviewed

Abstract

Reliable scene understanding is crucial for autonomous off-road navigation. This work proposes a perception framework based on multiple LiDARs and odometry that analyzes a robot’s environment to generate an occupancy grid map for the navigation task. A gradient-based approach separates obstacle and ground points. The exact positions of negative obstacles (cliffs and holes) are corrected using geometric relations. The obstacle points are then used to build an occupancy grid map for the robot. Observed obstacles are propagated to the next frame to cover blind spots in the sensor setup, and temporary misclassifications and dynamic obstacles are handled using ground points. The proposed framework is tested on a robot with two LiDARs to evaluate its performance. The results show successful navigation in the presence of positive and negative obstacles.
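The abstract's pipeline (gradient-based ground/obstacle separation, geometric correction of negative obstacles, occupancy grid generation) can be illustrated with a minimal sketch. This is not the paper's implementation: the cell size, gradient threshold, 4-neighbor slope test, highest-return-per-cell rule, and the flat-ground assumption in the similar-triangles correction are all illustrative choices.

```python
import numpy as np

def occupancy_from_points(points, cell_size=0.25, grad_thresh=0.5, grid_dim=8):
    """Bin 3D points (x, y, z) into a 2D height grid, then mark a cell
    occupied when the slope to any 4-neighbor exceeds grad_thresh
    (a simple stand-in for a gradient-based obstacle/ground split)."""
    heights = np.full((grid_dim, grid_dim), np.nan)
    for x, y, z in points:
        i, j = int(x // cell_size), int(y // cell_size)
        if 0 <= i < grid_dim and 0 <= j < grid_dim:
            if np.isnan(heights[i, j]) or z > heights[i, j]:
                heights[i, j] = z  # keep the highest return per cell
    occupied = np.zeros((grid_dim, grid_dim), dtype=bool)
    for i in range(grid_dim):
        for j in range(grid_dim):
            if np.isnan(heights[i, j]):
                continue
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if (0 <= ni < grid_dim and 0 <= nj < grid_dim
                        and not np.isnan(heights[ni, nj])):
                    slope = abs(heights[i, j] - heights[ni, nj]) / cell_size
                    if slope > grad_thresh:
                        occupied[i, j] = True
    return occupied

def correct_negative_obstacle(sensor_height, hit_distance, hit_depth):
    """One plausible geometric correction for negative obstacles: a beam
    from a sensor at height h that returns from depth d below ground at
    horizontal distance x crosses the ground plane at x * h / (h + d),
    i.e. the hole's near edge is closer than the raw return suggests.
    Assumes locally planar ground; names are hypothetical."""
    return hit_distance * sensor_height / (sensor_height + hit_depth)
```

For example, a flat point cloud with one raised row of cells yields occupancy only along the step, and a return 1 m below ground at 3 m range from a sensor 1 m high maps the edge to 1.5 m.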

Original language: English
Title: Advances in Service and Industrial Robotics - RAAD 2023
Editors: Tadej Petrič, Aleš Ude, Leon Žlajpah
Publisher: Springer, Cham
Pages: 178-185
Number of pages: 8
Volume: 135
ISBN (electronic): 978-3-031-32606-6
ISBN (print): 978-3-031-32605-9
DOI:
Publication status: Published - 2023

Publication series

Name: Mechanisms and Machine Science
Volume: 135 MMS
ISSN (print): 2211-0984
ISSN (electronic): 2211-0992
