IMAGE MATCHING OPTIMIZATION VIA VISION AND INERTIAL DATA FUSION: APPLICATION TO NAVIGATION OF THE VISUALLY IMPAIRED
Authors:
EDWIGE PISSALOUX
Université Pierre et Marie Curie (UPMC), Institut des Systèmes Intelligents et de Robotique (ISIR), CNRS UMR 7222, 4 Place Jussieu, Pyramide Tour 55, Boîte Courrier 173, 75 005 Paris, France
YONG CHEN
Université Pierre et Marie Curie (UPMC), Institut des Systèmes Intelligents et de Robotique (ISIR), CNRS UMR 7222, 4 Place Jussieu, Pyramide Tour 55, Boîte Courrier 173, 75 005 Paris, France
RAMIRO VELAZQUEZ
Universidad Panamericana, Mecatrónica y Control de Sistemas, Fracc. Rústicos Calpulli. C.P. 20290, Aguascalientes, Ags., Mexico
Abstract:
Autonomous navigation, whether by humans or robots, usually relies on stereo image matching. However, classic image-matching methods are unsuitable for wearable real-time navigation systems. This paper proposes a new image-matching algorithm that fuses information from images and inertial data. A preliminary evaluation of the algorithm shows that it is effective for indoor navigation.
Keywords:
Autonomous navigation; image matching; vision-inertial data fusion; navigation assistance for the visually impaired