Conference paper, Year: 2021

Direct 3D model-based tracking in omnidirectional images robust to large inter-frame motion

Abstract

This paper tackles direct 3D model-based pose tracking, using the Photometric Gaussian Mixtures (PGM) transform of omnidirectional images as direct features. The contributions include an adaptation of the pose optimization to omnidirectional cameras and a redesign of the initialization and optimization rules for the PGM extent. These enhancements considerably widen the convergence domain. Applied to images acquired onboard a mobile robot in an urban environment described by a large colored 3D point cloud, the method shows significant robustness to large inter-frame motion compared to approaches that directly use pixel brightness as direct features.
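To make the idea of direct features more concrete, below is a minimal sketch, not the authors' implementation, of direct model-based photometric pose alignment: a colored 3D point cloud is projected into an omnidirectional (equirectangular) image under a candidate pose and a photometric error is evaluated against a smoothed image. The PGM transform is approximated here by a plain Gaussian blur of extent sigma, and spherical_project, photometric_cost, and the equirectangular projection model are illustrative assumptions, not the paper's formulation.

# Minimal sketch (assumptions noted above) of direct model-based pose tracking:
# project a colored 3D point cloud under a candidate pose, then score the pose
# by the photometric error against a Gaussian-smoothed omnidirectional image.
import numpy as np
from scipy.ndimage import gaussian_filter

def spherical_project(points, pose, width, height):
    """Project 3D points onto the unit sphere, then to equirectangular pixels."""
    R, t = pose[:3, :3], pose[:3, 3]
    p = (R @ points.T).T + t                          # points in the camera frame
    d = p / np.linalg.norm(p, axis=1, keepdims=True)  # directions on the unit sphere
    u = (np.arctan2(d[:, 0], d[:, 2]) / (2 * np.pi) + 0.5) * width
    v = (np.arcsin(np.clip(d[:, 1], -1.0, 1.0)) / np.pi + 0.5) * height
    return np.stack([u, v], axis=1)

def photometric_cost(pose, points, intensities, image, sigma):
    """Sum of squared differences between model intensities and the blurred image."""
    blurred = gaussian_filter(image, sigma)           # stand-in for the PGM extent (assumption)
    uv = spherical_project(points, pose, image.shape[1], image.shape[0]).astype(int)
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < image.shape[1]) & \
             (uv[:, 1] >= 0) & (uv[:, 1] < image.shape[0])
    residual = blurred[uv[inside, 1], uv[inside, 0]] - intensities[inside]
    return np.sum(residual ** 2)

In this sketch, a larger sigma smooths the cost surface and therefore enlarges the region around the true pose from which a coarse initial estimate still converges, which mirrors the role the abstract attributes to the PGM extent; a coarse-to-fine schedule over sigma would then refine the pose estimate.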
Main file: 2021_ICAR_Guerbas.pdf (3.61 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03371352, version 1 (29-10-2021)

Identifiers

HAL Id: hal-03371352
DOI: 10.1109/ICAR53236.2021.9659324

Cite

Seif-Eddine Guerbas, Nathan Crombez, Guillaume Caron, El Mustapha Mouaddib. Direct 3D model-based tracking in omnidirectional images robust to large inter-frame motion. IEEE International Conference on Advanced Robotics, Dec 2021, Ljubljana, Slovenia. ⟨10.1109/ICAR53236.2021.9659324⟩. ⟨hal-03371352⟩