From Mapping to Localization: A Complete Framework to Visually Estimate Position and Attitude for Autonomous Vehicles
- Submitted by:
- Guoyu Lu
- Last updated:
- 22 September 2019 - 12:32am
- Document Type:
- Presentation Slides
- Document Year:
- 2019
An autonomous vehicle relies on localization algorithms to position itself and navigate to its destination. In this paper, we explore a lightweight visual localization method that estimates the vehicle's position and attitude from images rather than the dominant LiDAR data. We apply SLAM and an offline map-correction method to generate a high-precision map composed of 3D points and feature descriptors. For each image, we extract features and match them against the map to find correspondences. In the correspondence search, we rely on the previous camera-pose estimate to determine the search scope, which significantly improves localization accuracy. The search is embedded in the pose estimation stage, where we adjust the PnP procedure to better fit the autonomous driving task. Running on only a single CPU thread, our method demonstrates superior results in experiments on the benchmark KITTI dataset.
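The pose-guided search-scope idea from the abstract can be sketched as follows: project the 3D map points into the current image using the previous frame's pose estimate, then restrict descriptor matching for each detected feature to map points whose predicted projection falls nearby. This is a minimal NumPy illustration, not the authors' implementation; the intrinsics, pose, map points, and search radius below are all made-up example values.

```python
import numpy as np

def project(points_3d, R, t, K):
    # Project 3D map points into the image using pose (R, t) and intrinsics K.
    cam = (R @ points_3d.T + t.reshape(3, 1)).T          # world -> camera frame
    uv = (K @ cam.T).T                                   # camera -> pixel (homogeneous)
    return uv[:, :2] / uv[:, 2:3]                        # perspective divide

def candidates_in_scope(keypoint, map_proj, radius):
    # Indices of map points whose predicted projection lies within `radius`
    # pixels of the detected keypoint; only these are matched by descriptor.
    dists = np.linalg.norm(map_proj - keypoint, axis=1)
    return np.nonzero(dists < radius)[0]

# Toy example (hypothetical values):
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0,   0.0,   1.0]])
R_prev = np.eye(3)                    # previous frame's rotation estimate
t_prev = np.zeros(3)                  # previous frame's translation estimate
map_points = np.array([[0.0, 0.0, 5.0],
                       [1.0, 0.0, 5.0],
                       [0.0, 1.0, 5.0]])

proj = project(map_points, R_prev, t_prev, K)
# A feature detected near the first map point's predicted location:
idx = candidates_in_scope(np.array([322.0, 241.0]), proj, radius=20.0)
print(idx)  # -> [0]: only the nearby map point is kept as a match candidate
```

Restricting the candidate set this way both speeds up matching and rejects geometrically implausible correspondences before they reach the PnP stage.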
Comments
Presentation slide for ICIP paper "From Mapping to Localization: A Complete Framework to Visually Estimate Position and Attitude for Autonomous Vehicles"