Institute of Electrical and Electronics Engineers (IEEE)
2018 International Conference on Unmanned Aircraft Systems (ICUAS)
Data fusion algorithms make it possible to aggregate information from multiple data sources in order to increase the robustness and accuracy of robotic vision systems. While Bayesian fusion methods are common in general applications involving multiple sensors, the computer vision field has largely neglected this approach. In particular, most object following algorithms tend to employ a fixed set of features computed by specialized algorithms, and therefore lack flexibility. In this work, we propose a general hierarchical Bayesian data fusion framework that allows any number of vision-based tracking algorithms to cooperate in the task of estimating the target position. The framework is adaptive in the sense that it responds to variations in the reliability of each individual tracker, as estimated both by the tracker's local statistics and by the overall consensus among the trackers. The proposed approach was validated in simulated experiments as well as on two robotic platforms, and the experimental results confirm that it can significantly improve the performance of individual trackers.
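The core idea of weighting each tracker's position estimate by its reliability can be illustrated with a minimal Bayesian fusion sketch. The snippet below is an assumption-laden simplification, not the paper's actual algorithm: it models each tracker's output as a 1-D Gaussian estimate, inflates a tracker's variance when it disagrees with the consensus of the others, and then combines the estimates by standard inverse-variance (product-of-Gaussians) fusion. The function names and the inflation rule are illustrative choices, not from the source.

```python
def fuse_gaussians(estimates):
    """Inverse-variance fusion of (mean, variance) Gaussian estimates."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mean = sum(w * m for w, (m, _) in zip(weights, estimates)) / total
    return mean, 1.0 / total


def penalize_outliers(estimates, threshold=3.0):
    """Inflate the variance of estimates far from the median (a stand-in
    for the consensus-based reliability adaptation described abstractly
    in the text; the specific rule here is hypothetical)."""
    means = sorted(m for m, _ in estimates)
    median = means[len(means) // 2]
    adjusted = []
    for m, var in estimates:
        # If a tracker deviates from the consensus by more than
        # `threshold` standard deviations, down-weight it heavily.
        if abs(m - median) > threshold * var ** 0.5:
            var *= 100.0
        adjusted.append((m, var))
    return adjusted


# Three trackers report target x-positions; the third has drifted.
trackers = [(10.0, 1.0), (10.4, 2.0), (25.0, 1.0)]
fused_mean, fused_var = fuse_gaussians(penalize_outliers(trackers))
```

Here the drifted third tracker is effectively ignored, so the fused mean stays near the two agreeing trackers; with all trackers in agreement, the fusion simply averages them with inverse-variance weights.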
Echeverri, Andrés F.; Medeiros, Henry P.; Walsh, Ryan; Reznichenko, Yevgeniy Vladimirovich; and Povinelli, Richard J., "Real-time Hierarchical Bayesian Data Fusion for Vision-based Target Tracking with Unmanned Aerial Platforms" (2018). Electrical and Computer Engineering Faculty Research and Publications, 598.