In this paper we propose an affordable solution to self-localization which
utilizes visual odometry and road maps as the only inputs. To this end, we
present a probabilistic model as well as an efficient approximate inference
algorithm, which is able to exploit distributed computation to meet the real-time
requirements of autonomous systems. Because of the probabilistic nature of the
model, we are able to cope with uncertainty due to noisy visual odometry and
inherent ambiguities in the map (e.g., in a Manhattan world). By exploiting
freely available, community-developed maps and visual odometry measurements,
we are able to localize a vehicle to within 3 m after only a few seconds of driving
on maps containing more than 2,150 km of drivable roads.
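The paper's model is far richer (it reasons over an entire road network), but the core idea of fusing noisy odometry with map constraints can be sketched as a toy one-dimensional histogram filter. Everything below is invented for illustration: the cell size, the noise level, and the location of the single junction are hypothetical, not values from the paper.

```python
import numpy as np

def predict(belief, step_cells, sigma_cells):
    """Convolve the position belief with a Gaussian motion kernel
    centred on the odometry-reported displacement (in map cells)."""
    n = len(belief)
    offsets = np.arange(-n + 1, n)
    kernel = np.exp(-0.5 * ((offsets - step_cells) / sigma_cells) ** 2)
    kernel /= kernel.sum()
    new = np.convolve(belief, kernel, mode="full")[n - 1 : 2 * n - 1]
    s = new.sum()
    return new / s if s > 0 else new

# Toy map: 100 cells of straight road; a left turn exists only near cell 70.
n = 100
belief = np.ones(n) / n  # start fully lost (uniform prior)

# Odometry: three noisy straight segments of roughly 20 cells each.
for _ in range(3):
    belief = predict(belief, step_cells=20, sigma_cells=2)

# Observing a left turn rules out every cell where the map has no junction.
turn_likelihood = np.zeros(n)
turn_likelihood[68:73] = 1.0  # only the junction region is map-consistent
belief = belief * turn_likelihood
belief /= belief.sum()

print(np.argmax(belief))  # posterior mass concentrates near the junction
```

This is the mechanism the abstract alludes to: odometry alone leaves the belief spread along the road, and it is the map (here, a single turn) that collapses the ambiguity.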
Map-Based Probabilistic Visual Self-Localization. Marcus A. Brubaker, Andreas Geiger and Raquel Urtasun. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2016. (PDF)
Lost! Leveraging the Crowd for Probabilistic Visual Self-Localization. Marcus A. Brubaker, Andreas Geiger and Raquel Urtasun. In Proceedings of IEEE CVPR 2013. (PDF) (Supplemental Material)