Document Type

Conference Proceeding



Publication Date

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Source Publication

2018 International Conference on Unmanned Aircraft Systems (ICUAS)

Source ISSN



Data fusion algorithms make it possible to aggregate information from multiple data sources in order to increase the robustness and accuracy of robotic vision systems. While Bayesian fusion methods are common in general applications involving multiple sensors, the computer vision field has largely neglected this approach. In particular, most object-following algorithms tend to employ a fixed set of features computed by specialized algorithms, and therefore lack flexibility. In this work, we propose a general hierarchical Bayesian data fusion framework that allows any number of vision-based tracking algorithms to cooperate in the task of estimating the target position. The framework is adaptive in the sense that it responds to variations in the reliability of each individual tracker, as estimated both by the tracker's local statistics and by the overall consensus among the trackers. The proposed approach was validated in simulated experiments as well as on two robotic platforms, and the experimental results confirm that it can significantly improve the performance of the individual trackers.
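The core idea of fusing several trackers' position estimates, with unreliable trackers down-weighted, can be illustrated with a minimal inverse-variance (product-of-Gaussians) fusion sketch. This is an illustrative simplification, not the paper's full hierarchical framework: the function name `fuse_estimates` and the per-tracker variance values are assumptions introduced here, standing in for each tracker's local reliability statistics.

```python
import numpy as np

def fuse_estimates(means, variances):
    """Fuse independent Gaussian position estimates by inverse-variance
    weighting (equivalent to taking the product of Gaussian likelihoods).
    Trackers with larger variance (lower estimated reliability)
    contribute less to the fused target position."""
    means = np.asarray(means, dtype=float)        # shape: (n_trackers, n_dims)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances                     # inverse-variance weights
    fused_var = 1.0 / weights.sum(axis=0)         # fused variance per dimension
    fused_mean = fused_var * (weights * means).sum(axis=0)
    return fused_mean, fused_var

# Three hypothetical trackers reporting (x, y) target positions; the
# third tracker is drifting and reports a large variance.
means = [[10.0, 5.0], [10.4, 5.2], [12.0, 7.0]]
variances = [[0.5, 0.5], [0.5, 0.5], [5.0, 5.0]]
fused_mean, fused_var = fuse_estimates(means, variances)
```

The fused estimate stays close to the two reliable trackers, and its variance is smaller than that of any single tracker, which is the basic mechanism that lets a fusion layer outperform its individual inputs.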


Accepted version. "Real-time Hierarchical Bayesian Data Fusion for Vision-based Target Tracking with Unmanned Aerial Platforms," in 2018 International Conference on Unmanned Aircraft Systems (ICUAS) (2018): 1262-1270. DOI. © 2018 Institute of Electrical and Electronics Engineers (IEEE). Used with permission.
